
HDR comparison shots

TSM

Member
I guess where I get confused is, if you can have a non wide color gamut display with hdr, and you can also have hdr that is mapped to rec 709, then what is hdr if not the combination of color and contrast?

HDR is just a marketing term for a new standard. That is all. HDR is not itself a single thing. Basically, HDR is the existing standards with a new color space (REC 2020) combined with a minimum of 10-bit color.
 

MazeHaze

Banned
The phrase was originally applied to this context in the sense of high contrast.

HDR10 signalling allows for high contrast with good precision, but only explicitly specifies a wider color range.

What HDR means depends on what you're referring to.
So in this context, hdr refers to a high contrast ratio, but the combination of wide color gamut with a high contrast gives the "hdr effect" that is currently much sought after in tvs?
 

MazeHaze

Banned
HDR is just a marketing term for a new standard. That is all. HDR is not itself a single thing. Basically, HDR is the existing standards with a new color space (REC 2020) combined with a minimum of 10-bit color.

But you can have hdr on an 8bit panel, and you can also have hdr in something that is mapped to rec709, that's why I always thought the high dynamic range refers to contrast ratio in this context.
 

Izuna

Banned
Color space has nothing to do with color depth. You can have 8 bit REC 2020.

I know.

I guess where I get confused is, if you can have a non wide color gamut display with hdr, and you can also have hdr that is mapped to rec 709, then what is hdr if not the combination of color and contrast?

What TSM said (I was proofing my post and scrolled up to see his haha)

But you can have hdr on an 8bit panel, and you can also have hdr in something that is mapped to rec709, that's why I always thought the high dynamic range refers to contrast ratio in this context.

You can't. HDR is really supposed to be short for HDR10, which is defined as having 10-bit colour mapped to Rec 2020. If it isn't that, then people are calling other shit HDR.

As this is a gaming forum I'd take it to only mean what Sony, MS and Netflix are referring to it as. If you see High Dynamic Range written in any other context, it's not what these UHD guys are talking about.

This is why, I suppose, the Samsung TV claiming to be 8-bit and HDR is literal false advertising.
 

TSM

Member
But you can have hdr on an 8bit panel, and you can also have hdr in something that is mapped to rec709, that's why I always thought the high dynamic range refers to contrast ratio in this context.

HDR means high dynamic range. 8-bit color wouldn't qualify for HDR as it doesn't have the dynamic range to do so. HDR refers to the expansion of color depth from 8-bit to 10-bit and the huge expansion of the color space going from REC 709 to REC 2020. These two changes make for a hell of an upgrade to the existing standards. It sure beats the smaller move that could have been 10-bit REC 709.
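If you want a feel for the scale of that jump, here's a quick back-of-the-envelope Python sketch (just the arithmetic, assuming the usual three R/G/B channels per pixel):

```python
# Rough arithmetic behind the 8-bit vs 10-bit jump.
# Assumes three colour channels (R, G, B) per pixel.
for bits in (8, 10):
    levels_per_channel = 2 ** bits            # 256 or 1024 shades per channel
    total_colours = levels_per_channel ** 3   # every R/G/B combination
    print(f"{bits}-bit: {levels_per_channel} levels per channel, "
          f"{total_colours:,} total colours")

# 8-bit works out to 16,777,216 colours (~16.8 million);
# 10-bit works out to 1,073,741,824 colours (~1.07 billion).
```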
 

MazeHaze

Banned
HDR means high dynamic range. 8-bit color wouldn't qualify for HDR as it doesn't have the dynamic range to do so. HDR refers to the expansion of color depth from 8-bit to 10-bit and the huge expansion of the color space going from REC 709 to REC 2020. These two changes make for a hell of an upgrade to the existing standards. It sure beats the smaller move that could have been 10-bit REC 709.

So then what are they talking about here?

http://www.creativeplanetnetwork.co...ide-color-gamut-and-high-dynamic-range/614877

According to these people on the panel at the NAB Post Production World conference, hdr is a high contrast ratio and separate from wide color gamut.
 

Izuna

Banned

They're saying that you need both Rec 2020 and 10-bit for HDR.

Not just 10-bit.

Exactly what TSM said...

[Image: rec2020-vs-rec709-001.png]


This is your Wide Colour Gamut... but HDR10 promises using this as well as 1.07 billion (10-bit) colours at once, rather than only 16.8 million (8-bit).
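For a rough sense of how much bigger that triangle is, here's a small Python sketch (mine, purely illustrative) comparing the area each gamut covers on the CIE xy diagram using the published primary coordinates. Area in xy isn't perceptually uniform, so treat the ratio as a ballpark figure:

```python
# Compare the area of the Rec. 709 and Rec. 2020 triangles on the
# CIE 1931 xy chromaticity diagram, using the shoelace formula.

def triangle_area(p1, p2, p3):
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

rec709  = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]  # R, G, B primaries
rec2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]  # R, G, B primaries

a709, a2020 = triangle_area(*rec709), triangle_area(*rec2020)
print(f"Rec. 709 area:  {a709:.4f}")
print(f"Rec. 2020 area: {a2020:.4f} (~{a2020 / a709:.1f}x larger)")

# Rec. 2020's triangle comes out at roughly 1.9x the area of Rec. 709's.
```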
 

HTupolev

Member
According to these people on the panel at the NAB Post Production World conference, hdr is a high contrast ratio and separate from wide color gamut.
Yes, it can be taken that way.

"HDR" is associated with wide color gamut because HDR10 (and other specs), which was called "HDR" partly because it was intended to offer sufficient precision for high-contrast images to look good, also happens to provide for a wide color gamut.
 

MazeHaze

Banned
They're saying that you need both Rec 2020 and 10-bit for HDR.

Not just 10-bit.

Exactly what TSM said...

[Image: rec2020-vs-rec709-001.png]


This is your Wide Colour Gamut... but HDR10 promises using this as well as 1.07 billion (10-bit) colours at once, rather than only 16.8 million (8-bit).
But he specifically says that you can map hdr to rec 709, or you can take away hdr and go down to 100 nits and map rec2020. Listen to it again.

He says wcg and hdr just go hand in hand because the tech is available, but you can have one without the other.
 

Izuna

Banned
But he specifically says that you can map hdr to rec 709, or you can take away hdr and go down to 100 nits and map rec2020. Listen to it again.

He says wcg and hdr just go hand in hand because the tech is available, but you can have one without the other.

I'll simplify what he's saying for you:

"You can absolutely do Rec. 709 [10-bit], and it'll look great. You can change the colour gamut to Rec. 2020 and then go back and do 100nits and you can do [Rec. 2020] [8-bit]."

In this talk, he uses Wide Colour Gamut and Rec. 2020 interchangeably. And by SDR and HDR he is talking about bit-depth.
Or maybe he's referring to 12-bit, who knows. We used to call this Deep Colour, right?


edit: nvm, HTupolev's version makes more sense.

Right now, HDR10, which is what this is all about now, refers to Rec. 2020 10-bit. He's basically explaining why High Dynamic Range now assumes both are being done.
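To make that working definition concrete, here's a tiny, purely illustrative sketch. The Signal type and its field names are invented for the example, and I've added the PQ transfer function (SMPTE ST 2084) since HDR10 specifies that as well:

```python
# Hypothetical illustration of the working definition used in this thread:
# a signal only counts as "HDR10" if it is 10-bit, uses the Rec. 2020
# container, and uses the PQ (SMPTE ST 2084) transfer function.
from dataclasses import dataclass

@dataclass
class Signal:
    bit_depth: int        # bits per colour channel
    color_primaries: str  # e.g. "rec709" or "rec2020"
    transfer: str         # e.g. "gamma2.4" (SDR) or "pq" (HDR10)

def looks_like_hdr10(s: Signal) -> bool:
    return s.bit_depth >= 10 and s.color_primaries == "rec2020" and s.transfer == "pq"

print(looks_like_hdr10(Signal(10, "rec2020", "pq")))        # True
print(looks_like_hdr10(Signal(8,  "rec2020", "pq")))        # False: the 8-bit panel case
print(looks_like_hdr10(Signal(10, "rec709",  "gamma2.4")))  # False: "10-bit SDR"
```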
 

HTupolev

Member
But he specifically says that you can map hdr to rec 709, or you can take away hdr and go down to 100 nits and map rec2020. Listen to it again.

He says wcg and hdr just go hand in hand because the tech is available, but you can have one without the other.
Context. They're obviously using HDR to refer specifically to luminance range, and WCG to refer to chromaticity range. This is perfectly reasonable usage; you can indeed have a high-luminance-contrast rec 709 image, or a low-luminance-contrast rec 2020 image.
 

MazeHaze

Banned
Context. They're obviously using HDR to refer specifically to luminance range, and WCG to refer to chromaticity range. This is perfectly reasonable usage; you can indeed have a high-luminance-contrast rec 709 image, or a low-luminance-contrast rec 2020 image.

See, but cobalt was interpreting hdr in this example to refer to 10 bit to support her argument that hdr "has nothing to do with brightness or contrast", and you're interpreting hdr in this example as meaning brightness/contrast. So who is right here?

Does hdr have nothing to do with brightness and contrast as cobalt has been arguing all day?
 

Izuna

Banned
See, but cobalt was interpreting hdr in this example to refer to 10 bit to support her argument that hdr "has nothing to do with brightness or contrast", and you're interpreting hdr in this example as meaning brightness/contrast. So who is right here?

Does hdr have nothing to do with brightness and contrast as cobalt has been arguing all day?

The context is that on NeoGAF, HDR refers to HDR10, since we care about it due to the Xbox One S and PS4 Pro announcement.

There have been so many things called High Dynamic Range in the past.

And when you are talking about HDR content playing on your TV, via Netflix or a UHD Blu-ray, it's also referring to HDR10.
 

MazeHaze

Banned
The context is that on NeoGAF, HDR refers to HDR10, since we care about it due to the Xbox One S and PS4 Pro announcement.

There have been so many things called High Dynamic Range in the past.
But in this specific example in the OP, we are referring to the displays themselves and hdr, not the game consoles. So to say "hdr has nothing to do with brightness and contrast" is incorrect.

Edit: in that panel they also refer to hdr as a high contrast ratio, separate from color grading.
 

MazeHaze

Banned
Nobody, because everyone here is fully aware that this is a bizarre semantic argument, and yet we're choosing to have it anyway.

We all know what the video meant.
We all know what Cobalt meant.
Not when I'm being talked down to all day like I'm some idiot because I say that hdr can refer to a high contrast ratio, and doesn't necessarily only mean hdr10.

Edit:

My specific argument was that hdr can allow for more detail in bright areas of the screen due to the enhanced contrast, and cobalt argued that I'm wrong and hdr has nothing to do with contrast.
 
I've decided to cross post if that's okay. I feel like the thread title here brings different people in looking for the same info.

http://www.neogaf.com/forum/showthread.php?t=1276662

WHAT IS HDR?


  • 10-bit Colour
  • Rec. 2020


SO, WHAT DO WE SEE?




WHAT'S THE DEAL WITH THIS TALK OF MORE BRIGHTNESS?




A COMPARISON BETWEEN 8-BIT AND LOWER?



IS IT LIKE LIMITED VS. FULL COLOUR SPACE?




8-BIT? 24-BIT? 10-BIT? 30-BIT?




IS HDR10 JUST 10-BIT COLOUR THEN?




I WANT THE UNSIMPLIFIED EXPLANATION

--

According to Wikipedia:

The human eye can discriminate up to ten million colors.


And here's the reference:

D. B. Judd and G. Wyszecki (1975). Color in Business, Science and Industry. Wiley Series in Pure and Applied Optics (third ed.). New York: Wiley-Interscience. pp. 388. ISBN 0-471-45212-2.


So HDR no necessary??
 

Izuna

Banned
We really should all agree to... agree to stop this. We've all benefited from looking at more links and videos and sharpened our knowledge of HDR; regardless, we're all pretty clear on what we each mean, even if some of us (me) prefer semantics to be as little confusing as possible.

My HDR is better than your HDR.

- Really, I fully apologise if I sounded antagonistic or like I was insulting your intelligence. I was being overly stubborn, and at the very least that's rude considering you kept this conversation going. Really, it was a good debate, and I don't feel like it got heated. May our next disagreement be about something subjectively apparent so we can agree to disagree much quicker.

According to Wikipedia:

The human eye can discriminate up to ten million colors.


And here's the reference:

D. B. Judd and G. Wyszecki (1975). Color in Business, Science and Industry. Wiley Series in Pure and Applied Optics (third ed.). New York: Wiley-Interscience. pp. 388. ISBN 0-471-45212-2.


So HDR no necessary??

Can I tag someone else in?
 

J-Rzez

Member
The "problem" with most HDR TVs (at this time) is, that they support HDR. But they do not meet the full requirement of BT.2020. That mostly affect brightness, since these TVs can not reach 1000 nits.
TVs who have a 10bit Panel are really expensive (e.g. 65DXW904). OLED TVs have even more problems with HDR than normal TVs. Also most OLEDs have much more problems to reach that brightness (they are usually between 400-600 nits).

I hope that TV manufacturers will be able to release TVs with a 10bit panel cheaper in the near future. Until then I do not want to pay money for a "half HDR TV", even you can already a difference with them. But I want all ^__^ Until then I will save some more money to get an even more powerfull TV.

OLED doesn't need to hit 1000nits due to their contrast. Their standards are different.
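If I'm remembering the UHD Alliance "Ultra HD Premium" numbers correctly (treat the exact thresholds as my assumption, not gospel), there are two alternative luminance tiers, which is why OLED gets a pass at a lower peak brightness. Roughly:

```python
# Hedged sketch of the two-tier "Ultra HD Premium" luminance requirement as I
# recall it (numbers are an assumption, check the UHD Alliance spec):
# either >=1000 nits peak with <=0.05 nits black (the LCD route),
# or >=540 nits peak with <=0.0005 nits black (the OLED route).

def meets_premium_luminance(peak_nits: float, black_nits: float) -> bool:
    lcd_route = peak_nits >= 1000 and black_nits <= 0.05
    oled_route = peak_nits >= 540 and black_nits <= 0.0005
    return lcd_route or oled_route

print(meets_premium_luminance(1100, 0.04))   # bright LCD: True
print(meets_premium_luminance(600, 0.0005))  # typical OLED: True
print(meets_premium_luminance(600, 0.05))    # dimmer LCD: False
```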
 

androvsky

Member
We really should all agree to... agree to stop this. We've all benefited from looking at more links and videos and sharpened our knowledge of HDR; regardless, we're all pretty clear on what we each mean, even if some of us (me) prefer semantics to be as little confusing as possible.

My HDR is better than your HDR.



Can I tag someone else in?

I'm not sure I want to get into the previous argument, but I can take a stab at this one.

According to Wikipedia:

The human eye can discriminate up to ten million colors.


And here's the reference:

D. B. Judd and G. Wyszecki (1975). Color in Business, Science and Industry. Wiley Series in Pure and Applied Optics (third ed.). New York: Wiley-Interscience. pp. 388. ISBN 0-471-45212-2.


So HDR no necessary??
HDR as it's being used in regards to the PS4 Pro and UHD Blu-ray really refers to two different things.

One is brightness range; this is what's normally meant by HDR. Ignoring color, it allows a picture to have bright areas and dim areas while still having discernible detail in both. Standard displays only have 256 brightness levels, so there's not a lot of room to play with brightness. It's like how modern pop music CDs are all recorded at pretty much the same loudness, not allowing for very quiet sections or very loud sections like you'd get at a classical music concert.

The other part of HDR as a format (HDR10) is that it has a wider color gamut. This isn't a data thing so much as it requires displays to show colors they normally can't. Old TVs never really could show a good red; the best they could do was kind of an orange, with things like blood and wine being shown dark and with too much blue to make them look like a deeper red. Even today on a typical LCD TV, red isn't terribly red, blue doesn't go into the indigo or violet, and green is pretty mild. With a wider color gamut, an artist can ask for the usual red that normal TVs can show, but also ask for a red that would just blow your mind, and show them in the same scene.

The extra bits required to do that do imply more colors than humans can discriminate, but don't forget there needs to be room in there for the brightness levels too. It's all about brightness levels and new colors, and trust me, TVs aren't anywhere near maxing out either.
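A quick way to see the "256 brightness levels" point is to quantise the same smooth grey ramp at both bit depths and count how many distinct steps survive. Fewer steps stretched across a wider brightness range means more visible banding; this is just an illustration and ignores the gamma/PQ encoding real video uses on top:

```python
# Quantise a smooth 0.0 -> 1.0 grey ramp at 8-bit and 10-bit and count
# how many distinct code values it actually lands on.

def quantise(value, bits):
    """Map a 0.0-1.0 value to the nearest code at the given bit depth."""
    max_code = (1 << bits) - 1
    return round(value * max_code)

ramp = [i / 9999 for i in range(10000)]  # a finely sampled gradient

for bits in (8, 10):
    distinct = len({quantise(v, bits) for v in ramp})
    print(f"{bits}-bit ramp: {distinct} distinct steps")

# 8-bit ramp: 256 distinct steps
# 10-bit ramp: 1024 distinct steps
```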
 
4K alone is not the jump. It's 4K + HDR.

Yup.

What kind of comparison is this?? lmao

Tbh. i am not liking what i'm seeing. Specially not for the games. I always turn these kind of effects off if i can. Because it just looks bad imo. It's oversaturated and sky/lights usually look blown out.

This is generally what happens with HDR on a non-HDR display, especially an image looking incorrectly blown out. See HDR in person on a TV that supports it and HDR can be breathtaking. This is not like an artificial Vivid setting on TV that throws things off.

Can't show the benefits or HDR like this, as others have pointed out.
I doubt I'd want to have HDR in the long run in my home. I think TVs are bright enough already. Often end up turning my brightness way down over time. Feels better for the eyes.

Yeah, you need to see it on a TV that supports it and in a file format (or UHD disc) that brings it over.

This isn't a brightness boost that affects the entire image like the "brightness" setting. It makes the highlights and lighting/colors work better, so you can properly have dark and light content stand out right next to each other without blowing out and losing detail. This brightness makes for a more striking image rather than a blown-out one.
 
I'm not sure I want to get into the previous argument, but I can take a stab at this one.


HDR as it's being used in regards to the PS4 Pro and UHD Blu-ray really refers to two different things.

One is brightness range; this is what's normally meant by HDR. Ignoring color, it allows a picture to have bright areas and dim areas while still having discernible detail in both. Standard displays only have 256 brightness levels, so there's not a lot of room to play with brightness. It's like how modern pop music CDs are all recorded at pretty much the same loudness, not allowing for very quiet sections or very loud sections like you'd get at a classical music concert.

The other part of HDR as a format (HDR10) is that it has a wider color gamut. This isn't a data thing so much as it requires displays to show colors they normally can't. Old TVs never really could show a good red; the best they could do was kind of an orange, with things like blood and wine being shown dark and with too much blue to make them look like a deeper red. Even today on a typical LCD TV, red isn't terribly red, blue doesn't go into the indigo or violet, and green is pretty mild. With a wider color gamut, an artist can ask for the usual red that normal TVs can show, but also ask for a red that would just blow your mind, and show them in the same scene.

The extra bits required to do that do imply more colors than humans can discriminate, but don't forget there needs to be room in there for the brightness levels too. It's all about brightness levels and new colors, and trust me, TVs aren't anywhere near maxing out either.

I can understand the bright end of the spectrum, but isn't shadow detail driven by gamma? So if the gamma of the display does not change, how will you see better shadow detail with HDR? Also what happens when 48 bit color HDR comes along? Current displays will be obsolete, no? Seems like early adoption is just too risky.
 

androvsky

Member
I can understand the bright end of the spectrum, but isn't shadow detail driven by gamma? So if the gamma of the display does not change, how will you see better shadow detail with HDR? Also what happens when 48 bit color HDR comes along? Current displays will be obsolete, no? Seems like early adoption is just too risky.
Even with an aggressive gamma, there's just not a lot of room in 8 bits before the dark scene is really not that dark anymore. Human eyes can compensate like crazy for dark areas, so it's a bit of a problem when working on HDR content.

Another part of the standard is requiring TVs to suck less. You can't get black out of many TVs even these days, you're lucky to get a dark gray. It's getting better, but HDR really pushes adaptive backlighting and puts it to good use. To that point, current TVs can't quite handle the full current spec, so I'm not too worried about future ones. No, that doesn't speak well for early adoption, but for people in the market for a new TV anyway, even partial HDR support can be pretty nice (I have a 2015 Samsung 4K, I know what it's like).
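On the gamma question specifically, here's a rough sketch of why bit depth still matters for shadow detail even when the display gamma stays the same. I'm assuming a plain 2.4 power-law gamma and a 1000-nit peak purely for illustration; real HDR10 actually uses the PQ curve (SMPTE ST 2084) rather than a simple gamma:

```python
# Where do the first few code values land in nits, and how coarse are the
# jumps near black, at 8-bit vs 10-bit with the same display gamma?

GAMMA = 2.4         # assumed display gamma (illustrative)
PEAK_NITS = 1000.0  # assumed peak brightness (illustrative)

def code_to_nits(code, bits):
    max_code = (1 << bits) - 1
    return PEAK_NITS * (code / max_code) ** GAMMA

for bits in (8, 10):
    nits = ", ".join(f"{code_to_nits(c, bits):.5f}" for c in range(5))
    print(f"{bits}-bit, first five codes (nits): {nits}")

# 10-bit has roughly four codes for every one 8-bit code, so the near-black
# steps are far finer even though the gamma curve itself hasn't changed.
```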
 

MazeHaze

Banned
[Comparison photos: 94y3hs9.webp, Vu7EzEu.webp, C9YNafo.webp, b5uru4B.webp, IxOxhAo.jpg, FXqjwpV.jpg]


Just for fun I figured I'd post my own lol.

Same settings in hdr and sdr (back light maxed, brightness at default, same color warmth, etc)
 

Oblivion

Fetishing muscular manly men in skintight hosery
This is a stealth bragging thread by the op for being rich enough to own and watch two HDTVs at the same time.
 

DeeBatch

Member
Are all those from HDR enabled footage?



Also, yes, HDR is awesome. I've seen it many times, but now that both consoles are going in hard, I jumped in.



I have a Samsung HU8700 65" Curve 4K. While it doesn't have HDR out of the box, the Evolution Kit enables HDR for great results. Buying one that should be here next week.


Unfortunately, due to the lack of UHD in the Pro, I bought an XBO S for Blu Rays, and will wait on my PS4 Pro to see even more 4Kosh HDR goodness.


What a great time to be a graphics whore like myself...

I have the exact same TV and the Evolution Kit. HDR looks fantastic on it and the panel is 10-bit.
 

Izuna

Banned
Just for fun I figured I'd post my own lol.

Same settings in hdr and sdr (back light maxed, brightness at default, same color warmth, etc)

Just out of curiosity, did you change the HDR setting itself on your TV (probably called UHD color?)
 

MazeHaze

Banned
Just out of curiosity, did you change the HDR setting itself on your TV (probably called UHD color?)

No, you only use uhd color for hdmi sources; it's an option on each individual input to accept an hdr signal. This is the built-in amazon app. The way it works is that when an hdr video starts playing, it auto switches to a separate hdr mode. This mode cannot be manually engaged, but you can calibrate it the same as any other mode when it is engaged, and it saves those settings.

Edit: to clarify, you can't turn hdr on and off when an hdr source is playing. There are simply two separate versions of the show on amazon: an hd version, and a uhd hdr version. I imagine if you have a uhd set but no hdr, it must play another separate non-hdr version, similar to netflix.
 

Izuna

Banned
No, you only use uhd color for hdmi sources; it's an option on each individual input to accept an hdr signal. This is the built-in amazon app. The way it works is that when an hdr video starts playing, it auto switches to a separate hdr mode. This mode cannot be manually engaged, but you can calibrate it the same as any other mode when it is engaged, and it saves those settings.

Right, okay.

I don't have this myself so I apologise if this is too much to ask. Are you able to manually set your white balance etc. on whichever camera you are using? Or perhaps use something off-screen to use as your white balance?

In theory, doing this should make it impossible to see any additional discernable details between each image.
 

BONKERS

Member
Finally we get 10-bit color as some sort of standard and they restrict it to 4K to push as a gimmick for sales.

Just furthers the "brighter and bluer" retail sales mentality.

1080p 10-bit HDR could still be very beneficial, provided they get the shit that's adding more input lag in line.

But the TV industry is full of idiots.
 

MazeHaze

Banned
Right, okay.

I don't have this myself so I apologise if this is too much to ask. Are you able to manually set your white balance etc. on whichever camera you are using? Or perhaps use something off-screen to use as your white balance?

I could in a couple hours, I just left. You mean like hold up a white piece of paper or something?

These are simply quick photos taken on my phone and uploaded via the imgur app. If you mean the white balance on the tv itself, it's set to rtings calibration settings and set to carry over to all sources.
 

Izuna

Banned
I could in a couple hours, I just left. You mean like hold up a white piece of paper or something?

These are simply quick photos taken on my phone and uploaded via the imgur app. If you mean the white balance on the tv itself, it's set to rtings calibration settings and set to carry over to all sources.

Which phone are you using? So long as it isn't an iPhone it should be possible (afaik iPhone camera apps force automatic white balancing).

The issue with holding a white piece of paper is that it wouldn't be bright enough in that setting. You wouldn't have to try and find the perfect white balance, just so long as it's consistent (like setting it to Fluorescent or Daylight, manually setting exposure, etc.)

If it is an iPhone, the only hope you would have is having a lamp in the shot and tapping the touch screen where it's about as bright as the television screen.

~

This is just curiosity from my point of view, no requirement to actually do so haha. I'm just assuming you wouldn't see anything similar to the OP's images.

10bit monitors have been around for years. Any of them that has good contrast and black levels should qualify as HDR.

Sadly, what we call HDR (and what these devices are trying to display) requires more than just 10-bit colour. You have to wait for early 2017.
 

MazeHaze

Banned
Which phone are you using? So long as it isn't an iPhone it should be possible (afaik iPhone camera apps force automatic white balancing).

The issue with holding a white piece of paper is that it wouldn't be bright enough in that setting. You wouldn't have to try and find the perfect white balance, just so long as it's consistent (like setting it to Fluorescent or Daylight, manually setting exposure, etc.)

If it is an iPhone, the only hope you would have is having a lamp in the shot and tapping the touch screen where it's about as bright as the television screen.

~

This is just curiosity from my point of view, no requirement to actually do so haha. I'm just assuming you wouldn't see anything similar to the OP's images.

I have an lg g3. The problem I had trying to take pictures is that on both sdr and hdr some detail is lost because of how bright this tv gets, so I just tried to focus the shot as best I could and keep them as similar as I could.
 

TSM

Member
Just for fun I figured I'd post my own lol.

Same settings in hdr and sdr (back light maxed, brightness at default, same color warmth, etc)

If both of your HDTVs aren't calibrated to spec, this is just a picture of two random HDTVs. Even selecting similar color temperatures is pointless because defaults are always way off. Then there is the fact that different types of backlights can give professional calibration equipment trouble, never mind a consumer camera.
 

MazeHaze

Banned
If both of your HDTVs aren't calibrated to spec, this is just a picture of two random HDTVs. Even selecting similar color temperatures is pointless because defaults are always way off. Then there is the fact that different types of backlights can give professional calibration equipment trouble, never mind a consumer camera.

Those are both on the exact same tv, with the rtings calibration for color and 2-point white balance. As close as I can get without professional equipment.
 

Octavia

Unconfirmed Member
Comparisons are pointless unless you have two TVs professionally calibrated so that they are giving nearly the same values in standard mode, and you are seeing them in person.
 

Izuna

Banned
I have an lg g3. The problem I had trying to take pictures is that on both sdr and hdr some detail is lost because of how bright this tv gets, so I just tried to focus the shot as best I could and keep them as similar as I could.

Ah okay. You should be able to set all the camera stuff manually in this case. Turn your ISO to like ~50 and turn off auto white balance. Try to find some reference point (bright part of the screen) and make them the same.

=)
 

MazeHaze

Banned
Ah okay. You should be able to set all the camera stuff manually in this case. Turn your ISO to like ~50 and turn off auto white balance. Try to find some reference point (bright part of the screen) and make them the same.

=)
Cool cool, will try when I get home.
 