
HDR comparison shots

MazeHaze

Banned
[Image: Nvidia slide]

This image is taken from an Nvidia slide.

Look at where it shows HDR Decoder. When we stream HDR content to our TV, or when the Xbox One S tries to send the signal, the source first wants to know whether the screen is okay to receive it. HDMI 2.0a (the port, not the cable) lets the devices exchange the additional information needed to check this (similar to how DRM works: it will block you if it doesn't like the screen).
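To make that handshake concrete, here's a rough Python sketch (not any particular driver's code) of how a source could scan a sink's EDID for the CTA-861 "HDR Static Metadata" data block, which is how a screen advertises that it accepts HDR. The edid bytes and the sink_advertises_hdr name are placeholders for illustration.

def sink_advertises_hdr(edid: bytes) -> bool:
    # Walk the 128-byte extension blocks that follow the 128-byte base EDID.
    for ext_start in range(128, len(edid), 128):
        block = edid[ext_start:ext_start + 128]
        if len(block) < 5 or block[0] != 0x02:   # 0x02 = CTA-861 extension block
            continue
        dtd_offset = block[2]                    # where the detailed timings begin
        i = 4                                    # CTA data blocks start at byte 4
        while i < dtd_offset and i < len(block):
            tag = block[i] >> 5                  # top 3 bits: data block tag code
            length = block[i] & 0x1F             # low 5 bits: payload length
            # Tag 7 = "use extended tag"; extended tag 0x06 = HDR Static Metadata.
            if tag == 7 and length >= 1 and block[i + 1] == 0x06:
                return True
            i += 1 + length
    return False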

Back to the requirements for HDR Media Profile:

1. aka the Perceptual Quantizer (PQ), we need to be able to display Rec. 2020 -- which is the wider colour gamut. This alone doesn't mean more colours; it means it can pick out colours from anywhere in this space. This also allows the content to carry luminance levels of up to 10,000 cd/m2 -- not that it does, it's just that it can (nothing does, literally no TV does this). There's a quick sketch of the PQ curve just after this list.

2. The 10-bit Monitor has this, so that's fine.

3. The 10-bit Monitor can have this also. No worries.

4. Now here's another issue. These 10-bit monitors are not going to know what to do with this. Either the port is not HDMI 2.0a, so they can't carry it, or the monitor simply wasn't built to verify that it can accept the metadata (it can't).
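Here's that PQ curve as a quick Python sketch, using the constants published in SMPTE ST 2084 (the pq_encode name is just for illustration). It maps absolute luminance, up to that 10,000 cd/m2 ceiling, onto a 0..1 signal:

M1 = 2610 / 16384        # ~0.1593
M2 = 2523 / 4096 * 128   # ~78.84
C1 = 3424 / 4096         # ~0.8359
C2 = 2413 / 4096 * 32    # ~18.85
C3 = 2392 / 4096 * 32    # ~18.69

def pq_encode(luminance_cd_m2: float) -> float:
    """Map absolute luminance (0..10,000 cd/m2) to a PQ signal value in [0, 1]."""
    y = max(0.0, min(luminance_cd_m2 / 10000.0, 1.0))
    return ((C1 + C2 * y**M1) / (1 + C3 * y**M1)) ** M2

print(round(pq_encode(100), 3))     # ~0.508 -- roughly where SDR-level white lands
print(round(pq_encode(1000), 3))    # ~0.752 -- a 1,000-nit highlight
print(round(pq_encode(10000), 3))   # 1.0    -- the ceiling no current TV reaches

Note how a 1,000-nit highlight only uses about three quarters of the signal range -- the rest is headroom for displays that don't exist yet.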

And then there's the decoder problem as well. Nvidia's solution here is the Nvidia Shield, which has an HDR decoder. These monitors don't have one. It's specific software, and it's part of why the Xbox One S doesn't allow HDR content to go to them.

Also, I hope you're aware that there are 10-bit monitors with higher contrast ratios than TVs. It doesn't matter: if they don't accept the signal, they aren't displaying anything.

Early 2017 is when they should start to.
That's all fine, but speaking about hdr tvs, brightness is part of the standard. I don't understand why you say brightness has nothing to do with it, while every website you ask says it's essential.

Here's another from 4k.com

http://4k.com/high-dynamic-range-4k...-contrast-wide-color-gamut-tvs-content-offer/

"Currently, the “ideal” HDR standard that key players are pushing for would involve a dynamic range of 0 to 10,000 nits, which would really bring 4K TVs close to what real life looks like (the sky on a sunny day offers about 30,000 nits of brightness to the naked eye). However, in practical reality, even the latest HDR standards for premium 4K ultra HD TVs cover only 0.05 to 1100 nits



In very basic terms, HDR is the ability to expand the different stops of both bright and dark levels in a 4K TV for a wider, richer range of colors, much brighter, more realistic whites and much deeper, richer darks, all being manifested at the same time on the same display as needed. With this, a TV display takes on a more “dynamic” look and ultimately gives the content a viewer is looking at a far more vibrant and realistic appearance."

Is 4k.com wrong too? You told me to show you any source that says hdr has anything to do with brightness. I show you multiple sources and you just say they're all bad and wrong.
 

TSM

Member
HDR is just marketing talk for a combination of features that define a new standard.

10 bit color works perfectly fine with SDR. 10 bit color just provides more granularity. This gives us more steps of luminosity for Red, Green and Blue.
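To put numbers on that granularity (plain arithmetic, nothing HDR-specific):

# Steps per channel at each bit depth. More bits = finer gradations between
# the same black and peak-white endpoints, not a brighter or darker image.
for bits in (8, 10, 12):
    steps = 2 ** bits
    print(f"{bits}-bit: {steps} steps per channel, {steps ** 3:,} RGB combinations")
# 8-bit:  256 steps  -> 16,777,216 combinations
# 10-bit: 1024 steps -> 1,073,741,824 combinations
# 12-bit: 4096 steps -> 68,719,476,736 combinations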

Similarly contrast ratio works exactly the same whether it's HDR or SDR.

Gamma curve is the same.

White point is the same D65 (.313, .329) that SDR uses.

Where we start to deviate is the color gamut. The big difference is that the colors of red, green and blue have been shifted significantly. This gives us exponentially more colors to work with.



SDR:
8 bit color minimum
Rec 709
D65 white point
2.2 gamma

HDR:
10 bit color minimum
Rec 2020
D65 white point
2.2 gamma

As you can see, the only significant change is the change in color gamut. Other changes are merely technical, like metadata.
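For anyone who wants the actual numbers behind that gamut shift, these are the published CIE xy chromaticities for both standards (note the shared D65 white point):

# CIE 1931 xy coordinates. Only the red/green/blue corners move outward;
# the D65 white point is identical in both standards.
REC_709  = {"R": (0.640, 0.330), "G": (0.300, 0.600), "B": (0.150, 0.060),
            "W": (0.3127, 0.3290)}
REC_2020 = {"R": (0.708, 0.292), "G": (0.170, 0.797), "B": (0.131, 0.046),
            "W": (0.3127, 0.3290)}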

That's all fine, but speaking about hdr tvs, brightness is part of the standard. I don't understand why you say brightness has nothing to do with it, while every website you ask says it's essential.

Here's another from 4k.com

http://4k.com/high-dynamic-range-4k...-contrast-wide-color-gamut-tvs-content-offer/

"Currently, the “ideal” HDR standard that key players are pushing for would involve a dynamic range of 0 to 10,000 nits, which would really bring 4K TVs close to what real life looks like (the sky on a sunny day offers about 30,000 nits of brightness to the naked eye). However, in practical reality, even the latest HDR standards for premium 4K ultra HD TVs cover only 0.05 to 1100 nits



In very basic terms, HDR is the ability to expand the different stops of both bright and dark levels in a 4K TV for a wider, richer range of colors, much brighter, more realistic whites and much deeper, richer darks, all being manifested at the same time on the same display as needed. With this, a TV display takes on a more “dynamic” look and ultimately gives the content a viewer is looking at a far more vibrant and realistic appearance."

Is 4k.com wrong too? You told me to show you any source that says hdr has anything to do with brightness. I show you multiple sources and you just say they're all bad and wrong.

You don't understand how dynamic range works. That same contrast ratio on an SDR display gains you every last bit of advantage that it does on an HDR display. They both use the same gamma curve, so they both see similar results. The difference is that the optional spec that you keep mentioning is just that, optional. A rising tide lifts all boats. An improvement in contrast ratio will improve SDR just as much as HDR. Similarly, the same change in color bit depth (8 bit, 10 bit, 12 bit, etc.) gains you the same advantages whether it's SDR or HDR.
 

SebastianM

Member
No, because what you see on the right IS showing on your non-HDR screen. Literally EVERYTHING you see on a non-HDR display is, by definition, possible without HDR...

What I mean is, from the way both have an impact on the camera lens, you can tell the SDR screen compromises brightness over detail.
 

Izuna

Banned

I fully understand the mix-up you're making. Every time you go to one of those websites, you keep being referred to the Ultra HD Premium standard.

In an earlier post I showed that Panasonic neglects this entirely and just sticks HDR10 on their screens instead. This is because their mid-range screens that do support HDR (you can plug an Xbox One S into them) don't meet those requirements you keep quoting.

HDR10 was never supposed to force any contrast ratio. If it did, that would be a problem, since you wouldn't be able to easily verify it anyway. OLED screens, for example, have much lower nits for blacks but go nowhere near as bright, despite being HDR. You see this every time you quote the UHD Premium standard.

There are 10-bit OLED monitors, with much better contrast ratios than your own screen, that don't support HDR10.

Since HDR10 is an open platform -- and honestly, I can't wait until you actually see it -- there will be HDR10 screens with terrible contrast ratios.

The talk of better brightness is because this is part of the purpose of HDR10 in the first place. It's not a side-effect, it's an opportunity. We never cared about showing such bright colours because they were never available with the Rec. 709 standard. Now that we can, and we are getting 10-bit content, its benefits are being shown off on equipment with really good contrast ratios.

Those comparison pictures, especially the ones in the OP, do not speak for HDR10. This is where I entered this thread to begin with. All the benefits you described are not an effect of 10-bit colour, or "HDR", but rather of the fact that the screens themselves are just brighter etc. If OP put an OLED next to his old TV, he would not be able to make his HDR10 screen appear brighter than a run-of-the-mill 4K SDR television.

Everything you see in those comparison pictures has nothing to do with HDR, and is instead being folded into the marketing term.

So once again, this is what you're talking about:

[Image: Ultra HD Premium logo]


NOT HDR. HDR exists outside of that standard. The part where you're getting confused is that the sticker requires HDR10.

What I mean is, from the way both have an impact on the camera lens, you can tell the SDR screen compromises brightness over detail.

What's happening is that he has two different TVs, one that is brighter and one that isn't...
 

TSM

Member
What I mean is, from the way both have an impact on the camera lens, you can tell the SDR screen compromises brightness over detail.

The only real change between the two standards is the color gamut, so that's what manufacturers will try and demonstrate. An SDR set is every bit as bright if it is manufactured with the same contrast ratio. Even 10 bit color only gains you better granularity; it in itself does not make an image more or less bright. People are so fixated on the new HDR standard putting an emphasis on creating displays with higher contrast ratios that they don't realize that this will also apply to all displays. Non-HDR sets will gain the same benefits as HDR sets as contrast ratios improve.
 

MazeHaze

Banned
Right, but my post you argued with in the first place was that the hdr screen allowed for a brighter image without sacrificing detail in bright areas due to the enhanced contrast level. You told me the brightness had nothing to do with the hdr screen, and that's false. This thread wasn't about hdr10, it's about an hdr display vs a non hdr display.

So should I believe a condescending neogaf poster, or the multitude of websites and my own eyes when personally viewing things on my own hdr tv?
 
All the comments bashing HDR based on completely illogical photo/graphic comparisons are majorly off the mark.

Go to a store and see it in action. It's truly amazing. The most impressive jump in display tech since I saw a plasma tv for the first time, way back when.
 

TSM

Member
Right, but my post you argued with in the first place was that the hdr screen allowed for a brighter image without sacrificing detail in bright areas due to the enhanced contrast level. You told me the brightness had nothing to do with the hdr screen, and that's false. This thread wasn't about hdr10, it's about an hdr display vs a non hdr display.

So should I believe a condescending neogaf poster, or the multitude of websites and my own eyes when personally viewing things on my own hdr tv?

The only difference between an HDR screen and a non HDR screen is the color of the primary illuminants (red, green and blue). There isn't any secret new technology being deployed in the production of the actual screens that exists in one, but not the other. I think the big difference that will be seen initially is that manufacturers are pushing these as their high end sets so there will be more money put into the screens of HDR displays for the short term and they will have that advantage.
 

MazeHaze

Banned
The only difference between an HDR screen and a non HDR screen is the color of the primary illuminants (red, green and blue). There isn't any secret new technology being deployed in the production of the actual screens that exists in one, but not the other. I think the big difference that will be seen initially is that manufacturers are pushing these as their high end sets so there will be more money put into the screens of HDR displays for the short term and they will have that advantage.

Ok, but I was specifically talking about the display posted in this thread. I would also assume that going forward, all tvs with a 1000:0.05 contrast ratio would be 10 bit panels, and therefore hdr.
 

NeoTracer

Neo Member
All the comments bashing HDR based on completely illogical photo/graphic comparisons are majorly off the mark.

Go to a store and see it in action. It's truly amazing. The most impressive jump in display tech since I saw a plasma tv for the first time, way back when.

Yeah, it is seriously stunning. For everyone who hasn't experienced HDR, you have to see it with your own eyes and not rely on pictures. And if it isn't impressive, well, at least you saw it for yourself.
 

Izuna

Banned
Right, but my post you argued with in the first place was that the hdr screen allowed for a brighter image without sacrificing detail in bright areas due to the enhanced contrast level. You told me the brightness had nothing to do with the hdr screen, and that's false. This thread wasn't about hdr10, it's about an hdr display vs a non hdr display.

So should I believe a condescending neogaf poster, or the multitude of websites and my own eyes when personally viewing things on my own hdr tv?

I'm not being condescending purposefully.

I'm saying that it has nothing to do with contrast ratio. The 10-bit colour is why you can get a brighter image with detail. HDR doesn't make screens brighter. And any picture posted, showcasing any differences, will never show any benefit of HDR -- because if there is detail in one screen or the other, it is achievable on our SDR displays -- including the intense contrast ratio.

You asked why monitors aren't HDR. I explained why.

The write up for the BVM-X300 is my best explanation but it's extremely complicated (it is an HDR Monitor, but Professional, it's what they use to produce HDR content etc.)

That's all there is, really.

Ok, but I was specifically talking about the display posted in this thread. I would also assume that going forward, all tvs with a 1000:0.05 contrast ratio would be 10 bit panels, and therefore hdr.

That's optimistic, but I doubt it honestly. We will see a push for HDR, but we will not have that contrast ratio. Panasonic isn't even achieving that on their already-released HDR displays.
 

MazeHaze

Banned
I'm not being condescending purposefully.

I'm saying that it has nothing to do with contrast ratio. The 10-bit colour is why you can get a brighter image with detail. HDR doesn't make screens brighter. And any picture posted, showcasing any differences, will never show any benefit of HDR -- because if there is detail in one screen or the other, it is achievable on our SDR displays -- including the intense contrast ratio.

You asked why monitors aren't HDR. I explained why.

The write up for the BVM-X300 is my best explanation but it's extremely complicated (it is an HDR Monitor, but Professional, it's what they use to produce HDR content etc.)

That's all there is, really.


What does the "dynamic range" in hdr stand for then? Dynamic range of what? If not contrast and color, please explain.

Edit: in my personal experience, hdr content has brighter brights against fuller darks, and more detail in dark scenes. So how is brightness not a part of it?
 

TSM

Member
Ok, but I was specifically talking about the display posted in this thread. I would also assume that going forward, all tvs with a 1000:0.05 contrast ratio would be 10 bit panels, and therefore hdr.

My HDTV had a 10 bit screen back in the mid 2000s. Almost all processing on board standard HDTVs is at least 10 bit, if not 12 bit. 10 bit screens are nothing new. The big difference is that we finally have a 10 bit source to feed these HDTVs. What you really want if you have a 10 bit source is to have at least a 12 bit display. This way, when you calibrate the gamma, you have enough granularity to display it all. I'd be surprised if all these new HDR tvs aren't processing the image with a minimum of 12 bits of precision.
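A toy Python sketch of why those extra processing bits matter when you reshape gamma during calibration (the 1.08 tweak is an arbitrary stand-in for a calibration adjustment):

def distinct_levels(out_bits: int, gamma: float = 1.08) -> int:
    """Count how many distinct output codes survive a gamma tweak on a 10-bit source."""
    out_max = 2 ** out_bits - 1
    return len({round(((v / 1023) ** gamma) * out_max) for v in range(1024)})

print(distinct_levels(10))   # < 1024: some 10-bit source codes merge (banding)
print(distinct_levels(12))   # 1024: every source level survives in a 12-bit pipeline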

Contrast ratio is a separate issue. They could have mandated something similar for a new spec for REC 709 displays and it'd have given similarly impressive results.
 

Gbraga

Member
I think the best way to explain HDR is with this example:

"Say you are taking a picture of a red wagon. Current TVs can choose from 256 shades of that red color to show all the highlights and shadows. HDR TVs can use 4096 shades of that color to show even more nuanced color variation. It is like having a larger box of crayons with which to draw the image."

How is that?

MUCH better than any bullshit comparison that either gimps the "SDR image", or oversaturates the "HDR image".

Thanks for finally making me understand what the hell it even is.

I don't doubt it's a big difference, but the TVs are so expensive (and some pretty shitty models too) that I'd rather not even experience it for myself anytime soon :p

Also never tried 120~144hz for the same reason. Why make 60fps seem like not enough when I can't keep up with the hardware required to run new games at 144fps, or afford a good 144hz monitor?

I'm happy with my current image, but I'll definitely keep HDR in mind when my TV stops working for whatever reason and I'm forced to get another.

1080p Panasonic Plasma still treating me very well.
 
As an owner of a 4k tv without HDR, I'm going to wait for OLED TVs to have better response times / for the ones with decent response times to drop in price by like 25-30%.

Im jelly over HDR, but have no issues with my current TV at all.

Breh, that's what I thought... as I also have a nice 60+ inch Samsung UHD w/ no HDR, until I saw an HDR-capable TV at a friend's house (calibrated to boot)...

Fuuuuuucck me it looks nice.

It can and does, people like to shitpost

On titles less graphically intensive, yes - native 2160p. On streaming content like Netflix, video on demand, youtube etc., (hopefully MLB.TV too) yes.

On more graphically intensive games like Horizon it does a native resolution in between 1440p - 2160p, (digital foundry said something around native 1800p) and uses a good upscaling method to get a nice 2160p (4k) image. You should watch digital foundry's impressions on it.
 

TSM

Member
I think the best way to explain HDR is with this example:

"Say you are taking a picture of a red wagon. Current TVs can choose from 256 shades of that red color to show all the highlights and shadows. HDR TVs can use 4096 shades of that color to show even more nuanced color variation. It is like having a larger box of crayons with which to draw the image."

How is that?

It doesn't work like that. To attempt to make that shade of red with 8 bit color you have 256 levels each of red, green and blue (0 is black, and with video standards you only get codes 16 - 235 to make that shade). Odds are it's impossible to make that shade of red, though, because it doesn't even exist in the REC 709 color gamut.

That's what HDR displays are bringing to the table. That shade of red might now fall in the REC 2020 gamut. Then you have more granularity (10 bit color) to try and hit that shade of red.
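A rough sketch of that idea: whether a given chromaticity is reproducible is just a point-in-triangle test against the display's primaries. The deep_red coordinate below is made up for illustration:

REC_709  = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]   # R, G, B primaries (CIE xy)
REC_2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]

def in_gamut(xy, primaries):
    """True if the chromaticity xy lies inside the triangle of the three primaries."""
    (x, y), signs = xy, []
    for (x1, y1), (x2, y2) in zip(primaries, primaries[1:] + primaries[:1]):
        # Cross-product sign says which side of each triangle edge the point is on.
        signs.append((x2 - x1) * (y - y1) - (y2 - y1) * (x - x1) >= 0)
    return all(signs) or not any(signs)

deep_red = (0.69, 0.30)                  # hypothetical saturated red
print(in_gamut(deep_red, REC_709))       # False -- a Rec. 709 display can only approximate it
print(in_gamut(deep_red, REC_2020))      # True  -- representable in Rec. 2020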
 

Izuna

Banned
What does the "dynamic range" in hdr stand for then? Dynamic range of what? If not contrast and color, please explain.

Edit: in my personal experience, hdr content has brighter brights against fuller darks, and more detail in dark scenes. So how is brightness not a part of it?

It's marketing bullshit. High Dynamic Range (HDR) can mean whatever these TV companies want it to mean, but as far as Netflix, Xbox One S, UHD Blu-ray etc. are concerned, it means more colour.

They are two separate things.

If you took that content and put it on an SDR display, just like those images in the link I sent you, the brightest parts and darkest parts would still have the same nits (if I'm allowed to simplify); it's just that there wouldn't be as much detail.

It's like this ~ picture a scene with a flame in the middle and a grey coat in the background.

HDR Screen with Good Contrast Ratio = You can see the red in the flame, and the grey coat is visible.

HDR Screen with Bad Contrast Ratio, but on high brightness = You can't see the red in the flame, and the grey coat is visible.

HDR Screen with Bad Contrast Ratio, but on lower brightness = You can see the red in the flame, and the grey coat is invisible.

---

SDR Screen with Good Contrast Ratio = You can't see the red in the flame, and the grey coat is visible.

SDR Screen with Bad Contrast Ratio = You can't see the red in the flame, and the grey coat is not visible.

---

I hope you can separate these...

It doesn't work like that. To attempt to make that shade of red with 8 bit color you have 256 levels each of red, green and blue (0 is black, and with video standards you only get codes 16 - 235 to make that shade). Odds are it's impossible to make that shade of red, though, because it doesn't even exist in the REC 709 color gamut.

That's what HDR displays are bringing to the table. That shade of red might now fall in the REC 2020 gamut. Then you have more granularity (10 bit color) to try and hit that shade of red.

You can make it with dithering =P

6-bit displays have been kicking ass for the longest time.
 

MazeHaze

Banned
It's marketing bullshit. High Dynamic Range (HDR) can mean whatever these TV companies want it to mean, but as far as Netflix, Xbox One S, UHD Blu-ray etc. are concerned, it means more colour.

They are mutually exclusive.

If you took that content and put it on an SDR display, just like those images in the link I sent you, the brightest parts and darkest parts would still have the same nits (if I'm allowed to simplify); it's just that there wouldn't be as much detail.

It's like this ~ picture a scene with a flame in the middle and a grey coat in the background.

HDR Screen with Good Contrast Ratio = You can see the red in the flame, and the grey coat is visible.

HDR Screen with Bad Contrast Ratio, but on high brightness = You can't see the red in the flame, and the grey coat is visible.

HDR Screen with Bad Contrast Ratio, but on lower brightness = You can see the red in the flame, and the grey coat is invisible.

---

SDR Screen with Good Contrast Ratio = You can't see the red in the flame, and the grey coat is visible.

SDR Screen with Bad Contrast Ratio = You can't see the red in the flame, and the grey coat is not visible.

---

I hope you can separate these...



You can make it with dithering =P

6-bit displays have been kicking ass for the longest time.
I guess I'm just confused?

Can you explain how come, when I watch sdr content, the contrast ratio is great, stunning even, but when I watch something like Cosmos Laundromat, I am blown away by the way a silhouette can look against a bright sky -- something I don't see in any sdr content on the same set?
 

HTupolev

Member
It's like this ~ picture a scene with a flame in the middle and a grey coat in the background.

HDR Screen with Good Contrast Ratio = You can see the red in the flame, and the grey coat is visible.

SDR Screen with Good Contrast Ratio = You can't see the red in the flame, and the grey coat is visible.
Assuming the signals are mapped to the same luminance range, this isn't right. If red is straight-up desaturated to yellow/white then that's a result of a luminance blow-out in all the color channels. A wider color spectrum won't fix this, because having a red enough red isn't the problem. What'll fix it is a better contrast ratio that can allow for really bright reds to be displayed.
 

TSM

Member
You can make it with dithering =P

6-bit displays have been kicking ass for the longest time.

Unfortunately you can't hit any color that doesn't exist in the display's available color gamut. A display is only capable of producing colors that fall within its triangle. There's no trick you can do to produce colors outside of this gamut. You are limited by whatever colors of red, green and blue the manufacturer produced the screen with, and the backlight has to be capable of illuminating those colors.
 

HTupolev

Member
Unfortunately you can't hit any color that doesn't exist in the display's available color gamut. A display is only capable of producing colors that fall within its triangle. There's no trick you can do to produce colors outside of this gamut. You are limited by whatever colors of red, green and blue the manufacturer produced the screen with, and the backlight has to be capable of illuminating those colors.
Right. In REC 709 color space, dithering between (128,128,0) and (128,128,1) will give you an impression of an intermediate yellow tone that's basically (128,128,.5). But you can't use such methods to get a tone that's, say, more yellow than (128,128,0). You need to define a wider color space to do that, and that's what the REC 2020 color space is about.
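For what it's worth, the dithering trick looks roughly like this as a toy Python sketch (real panels do it spatially/temporally in hardware):

import random

def dither_channel(target: float, frames: int = 10000) -> float:
    """Alternate the two nearest integer codes so their average approximates `target`."""
    lo, hi = int(target), int(target) + 1
    frac = target - lo
    shown = [hi if random.random() < frac else lo for _ in range(frames)]
    return sum(shown) / frames

print(round(dither_channel(0.5), 2))   # ~0.5: a step the panel can't address directly
# But no amount of dithering gets you past 0 or 255 -- i.e. outside the
# gamut that the display's primaries define.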
 

McSpidey

Member
I get the feeling people who have been running their existing SDR or even HDR displays in uncalibrated wide colour gamut + torch mode brightness + game mode (...perhaps the vast majority of gamers, especially), despite not having wide colour/HDR sources, are going to be the most confused when seeing HDR content.
 
I get the feeling people who have been running their existing SDR or even HDR displays in uncalibrated wide colour gamut + torch mode brightness + game mode (...perhaps the vast majority of gamers, especially), despite not having wide colour/HDR sources, are going to be the most confused when seeing HDR content.

On a lot of TVs, putting on game mode doubles up as putting it on torch mode too. Sometimes they'll force overscan to zoom in on the image as well.

I hate it when people put it on game mode and call it a day. No, put it on game mode, then go back and adjust color/sharpness/brightness/contrast to reasonable levels.
 

Izuna

Banned
Right. In REC 709 color space, dithering between (128,128,0) and (128,128,1) will give you an impression of an intermediate yellow tone that's basically (128,128,.5). But you can't use such methods to get a tone that's, say, more yellow than (128,128,0). You need to define a wider color space to do that, and that's what the REC 2020 color space is about.

It's pretty exciting to finally be upgrading from 8-bit... We should never go back.
 

Izuna

Banned
Assuming the signals are mapped to the same luminance range, this isn't right. If red is straight-up desaturated to yellow/white then that's a result of a luminance blow-out in all the color channels. A wider color spectrum won't fix this, because having a red enough red isn't the problem. What'll fix it is a better contrast ratio that can allow for really bright reds to be displayed.

I feel like I included this in my example. But of course it's an oversimplification. We will, for sure, get HDR10 displays that have terrible contrast ratios unless the industry suddenly no longer cares about making trash displays.
 

MazeHaze

Banned
I feel like I included this in my example. But of course it's an oversimplification. We will, for sure, get HDR10 displays that have terrible contrast ratios unless the industry suddenly no longer cares about making trash displays.
There are already hdr10 displays with poor contrast ratios, and hdr doesn't look impressive on them. That's why a high dynamic range of contrast is necessary.

Source: I owned one before upgrading to my current display.
 

Izuna

Banned
There are already hdr10 displays with poor contrast ratios, and hdr doesn't look impressive on them. That's why a high dynamic range of contrast is necessary.

Source: I owned one before upgrading to my current display.

...

Omg I'm being trolled
 

MazeHaze

Banned
...

Omg I'm being trolled

No?

Edit:

I owned the Sony 850D. It gets super bright and has wide color gamut; unfortunately the IPS panel doesn't get dark enough, so instead of popping, hdr content looks washed out.

I now own the KS8000 and hdr content is amazing, and all the bright highlights and silhouettes in hdr content are mesmerizing.
 

Izuna

Banned
No?

Edit:

I owned the Sony 850D. It gets super bright and has wide color gamut; unfortunately the IPS panel doesn't get dark enough, so instead of popping, hdr content looks washed out.

I now own the KS8000 and hdr content is amazing, and all the bright highlights and silhouettes in hdr content are mesmerizing.

I believe you. So why were you saying that HDR has anything to do with contrast ratio? You know yourself that HDR content alone doesn't offer what you're asking.

High dynamic range of contrast ratio is a term you have made up, just now.

Again, what you're looking for is UHD Premium (which dictates contrast ratio, and is merely a standard), HDR honestly just refers to PQ and 10bit colour (and metadata).

I've been repeating that in every single post since the very beginning, and all this time you had an HDR display that didn't do what you were saying HDR WAS doing...

Just... nevermind. Really I don't win anything here, I'll accept that I'm wrong if it means the end.
 

Kibbles

Member
Whenever I try playing back HDR footage on my computer on my HDR projector it looks washed out. I wish this wasn't a pain in the ass to setup and shit just worked >_> No idea what's wrong.

Also have no idea how to setup my shield to output HDR to my HDR TV. Settings grayed out at 4:2:0 or whatever *sigh*
 

MazeHaze

Banned
I believe you. So why were you saying that HDR has anything to do with contrast ratio? You know yourself that HDR content alone doesn't offer what you're asking.

High dynamic range of contrast ratio is a term you have made up, just now.

Again, what you're looking for is UHD Premium (which dictates contrast ratio, and is merely a standard), HDR honestly just refers to PQ and 10bit colour (and metadata).

I've been repeating that in every single post since the very beginning, and all this time you had an HDR display that didn't do what you were saying HDR WAS doing...

Just... nevermind. Really I don't win anything here, I'll accept that I'm wrong if it means the end.

Because my semantics were wrong I guess? When saying hdr display I guess I meant a good hdr display that meets uhd alliance standards. Buying a bad hdr display to take advantage of hdr would be stupid. In this case, contrast ratio in an hdr capable display is extremely important, as I have owned an hdr display with poor contrast, and it wasn't very impressive at all. If you want an hdr display, don't even buy one that doesn't have good contrast. I'd take a nice sdr plasma over a shit hdr display any day.

See, this is why, again just from my personal experience, hdr isn't just about more color detail, you need an awesome contrast ratio to truly take advantage of the wide color gamut.
 

MazeHaze

Banned
Whenever I try playing back HDR footage on my computer on my HDR projector it looks washed out. I wish this wasn't a pain in the ass to setup and shit just worked >_> No idea what's wrong.

Also have no idea how to setup my shield to output HDR to my HDR TV. Settings grayed out at 4:2:0 or whatever *sigh*

Depending on the TV, there is a setting to change the HDMI port. It's usually called True Deep Color, or UHD Color, or something similar. This needs to be turned on to accept a 4K 4:4:4 signal or an HDR signal.
 

HTupolev

Member
I feel like I included this in my example.
You said that an HDR display with good contrast would be able to see red in a flame whose colors are blown out on an SDR display with good contrast.

If the lack of redness in the SDR display is due to all color channels being blasted high to preserve brightness, you'd have basically the same issue with an HDR display of comparably good contrast.

Where HDR would really help a lot is when the lack of redness is due to a display not being able to display a red enough red.

What a blown-out flame really needs is more contrast.
 
After seeing this I'm just growing more tempted to splurge and get a monitor or TV with HDR...

I've been happy with my 1080p 40" Sony Bravia so far, but if I'm missing out on this...
 
Some of these HDR photos, which blend multiple exposures, are not good at selling HDR for television. They might look nice for a moment -- Flickr is full of them -- but they do a disservice, as do long exposure times.

Forget the souped-up HDR photos where everything is aglow; nothing is new in those photos.

Think of a red car in real life: a digital 8-bit screen will have trouble actually displaying the correct colour no matter how many exposures you take. It will be close, but it won't contain some of the gradients and light bounce. Imagine a 1990s digitized version with 128 colours and how rough it is -- that's what current televisions are like, a rough approximation. Then you have the white and black range and how out of control they are; we're so used to seeing them on screen that you automatically know it's video. An HDR TV will be more like looking out of a window rather than at a screen with video limitations and imbalances.
 

Izuna

Banned
You said that an HDR display with good contrast would be able to see red in a flame whose colors are blown out on an SDR display with good contrast.

If the lack of redness in the SDR display is due to all color channels being blasted high to preserve brightness, you'd have basically the same issue with an HDR display of comparably good contrast.

Where HDR would really help a lot is when the lack of redness is due to a display not being able to display a red enough red.

What a blown-out flame really needs is more contrast.

I'm assuming a very bright red that belongs outside of Rec 709. It was a simplified example.

HDR is like Digital Vibrance Control by NVIDIA but not but kind of.

hurr hurr
 

HTupolev

Member
I'm assuming a very bright red that belongs outside of Rec 709. It was a simplified example.
You're blending brightness and saturation.

If it's simply a matter of a red being outside of REC 709, and the similar-contrast HDR display could handle it correctly, an SDR display would still be able to display it red, just not quite as red.
 

TheYanger

Member
See, this is something I don't really like. I already felt this in the Horizon demo, but it seems like a blue and orange enhancer.



Night and day difference.

damn it, late on the joke

The thing is, you can't look at it on a non-hdr monitor and say "I don't like it" because you can't really see it. It's unfortunate, because it looks AMAZING in person.
 

Izuna

Banned
You're blending brightness and saturation.

If it's simply a matter of a red being outside of REC 709, and the similar-contrast HDR display could handle it correctly, an SDR display would still be able to display it pretty red; just not as red.

I mean this is a relatively pointless hypothetical that doesn't exist in the real world. I just meant to try and separate contrast and wider colour gamut.

Regardless, I feel like there is enough information here.
 

MazeHaze

Banned
I mean this is a relatively pointless hypothetical that doesn't exist in the real world. I just meant to try and separate contrast and wider colour gamut.

Regardless, I feel like there is enough information here.

I guess where I get confused is, if you can have a non wide color gamut display with hdr, and you can also have hdr that is mapped to rec 709, then what is hdr if not the combination of color and contrast?
 

HTupolev

Member
I guess where I get confused is, if you can have a non wide color gamut display with hdr, and you can also have hdr that is mapped to rec 709, then what is hdr if not the combination of color and contrast?
The phrase was originally applied to this context in the sense of high contrast.

HDR10 signalling allows for high contrast with good precision, but only explicitly specifies a wider color range.

What HDR means depends on what you're referring to.
 