That's all fine, but speaking about HDR TVs, brightness is part of the standard. I don't understand why you say brightness has nothing to do with it, when every website you ask says it's essential.
This image is taken from a slide from nVidia
Look at where it shows HDR Decoder. When we stream HDR content to our TV, or the Xbox One S tries to send the signal, what it wants to know is whether or not the screen is okay to receive it. HDMI 2.0a (the port, not the cable) allows the source to send over additional information to check this (similar to how DRM works: it will block you if it doesn't like the screen).
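For the curious, that "is the screen okay to receive it" check is basically the source reading the display's EDID over HDMI and looking for an HDR capability block. Here's a rough sketch of what that lookup could look like; the offsets follow my reading of CTA-861 (first extension block only, HDR Static Metadata Data Block = extended tag 0x06), so treat the details as assumptions rather than a reference implementation:

[code]
# Rough sketch only: scan a raw EDID blob for the CTA-861 "HDR Static
# Metadata Data Block", which is roughly how a source learns that the sink
# accepts HDR10 (ST 2084 EOTF plus static metadata).  Offsets follow my
# reading of CTA-861 and are an assumption, not a reference implementation.
def sink_advertises_hdr10(edid: bytes) -> bool:
    if len(edid) < 256 or edid[126] < 1:
        return False                        # no extension blocks present
    ext = edid[128:256]                     # look at the first extension only
    if ext[0] != 0x02:                      # 0x02 = CTA-861 extension tag
        return False
    dtd_offset = ext[2]                     # data blocks live in bytes 4..dtd_offset-1
    i = 4
    while i < dtd_offset:
        header = ext[i]
        tag, length = header >> 5, header & 0x1F
        if tag == 0x07 and length >= 2 and ext[i + 1] == 0x06:
            eotf_bits = ext[i + 2]          # supported EOTFs bitfield
            return bool(eotf_bits & 0x04)   # bit 2 = SMPTE ST 2084 (PQ)
        i += 1 + length
    return False
[/code]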
Back to the requirements for HDR Media Profile:
1. aka the Perceptual Quantizer (PQ): we need to be able to handle Rec. 2020, which is the wider colour gamut. This alone doesn't mean more colours; it means the content can pick colours from this wider space. It also allows the content to carry luminance levels of up to 10,000 cd/m2 -- not that it does, just that it can (nothing does; literally no TV reaches that). A rough sketch of the PQ curve follows this list.
2. The 10-bit Monitor has this, so that's fine.
3. The 10-bit Monitor can have this also. No worries.
4. Now here's another issue. These 10-bit monitors are not going to know what to do with this. Either the port isn't HDMI 2.0a, so it can't carry the metadata, or the monitor simply wasn't built to tell the source it can accept the metadata (it can't).
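Since point 1 mentions luminance up to 10,000 cd/m2, here's the rough sketch of the PQ curve I promised above. The constants are the published SMPTE ST 2084 values; the rest is just a toy to show how 10-bit code values map to nits:

[code]
# Minimal sketch of the SMPTE ST 2084 (PQ) EOTF: maps a normalised 10-bit
# code value (0.0 - 1.0) to absolute luminance in cd/m2, topping out at 10,000.
# The constants below are the published ST 2084 values.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_eotf(signal: float) -> float:
    """PQ code value in [0, 1] -> luminance in cd/m2 (nits)."""
    e = signal ** (1 / M2)
    y = max(e - C1, 0.0) / (C2 - C3 * e)
    return 10000.0 * y ** (1 / M1)

for code in (0, 256, 512, 767, 1023):
    print(f"{code:4d} -> {pq_eotf(code / 1023):8.2f} nits")
[/code]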
And then there's the problem of the decoder. Nvidia's solution here is to use the Nvidia Shield, which has an HDR decoder. These monitors don't have this. It's specific software, and its absence is part of why the Xbox One S won't send HDR content to them.
Also, I hope you're aware that there are 10-bit monitors with higher contrast ratios than these TVs. It doesn't matter, because they don't accept the signal. And if they don't accept the signal, they aren't displaying anything.
Early 2017 is when they should start to.
Here's another, from 4k.com:
http://4k.com/high-dynamic-range-4k...-contrast-wide-color-gamut-tvs-content-offer/
"Currently, the “ideal” HDR standard that key players are pushing for would involve a dynamic range of 0 to 10,000 nits, which would really bring 4K TVs close to what real life looks like (the sky on a sunny day offers about 30,000 nits of brightness to the naked eye). However, in practical reality, even the latest HDR standards for premium 4K ultra HD TVs cover only 0.05 to 1100 nits
In very basic terms, HDR is the ability to expand the different stops of both bright and dark levels in a 4K TV for a wider, richer range of colors, much brighter, more realistic whites and much deeper, richer darks, all being manifested at the same time on the same display as needed. With this, a TV display takes on a more “dynamic” look and ultimately gives the content a viewer is looking at a far more vibrant and realistic appearance."
Is 4k.com wrong too? You told me to show you any source that says hdr has anything to do with brightness. I showed you multiple sources and you just say they're all bad and wrong.
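For what it's worth, here's a quick back-of-the-envelope on those nit figures. One "stop" is a doubling of luminance, so the span in stops is log2(peak / black); the SDR row uses an assumed, typical 100-nit / 0.1-nit LCD purely for comparison:

[code]
import math

# Rough arithmetic only: one "stop" is a doubling of luminance, so the span
# in stops is log2(peak / black).  The SDR row is an assumed, typical
# 100-nit / 0.1-nit LCD, used purely for comparison.
for label, black, peak in [("HDR range quoted above (0.05 - 1100 nits)", 0.05, 1100),
                           ("assumed typical SDR LCD (0.1 - 100 nits)", 0.1, 100)]:
    print(f"{label}: {math.log2(peak / black):.1f} stops")
[/code]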
No, because what you see on the right IS showing on your non-HDR screen. Literally EVERYTHING you see on a non-HDR display is, by definition, possible without HDR...
What I mean is the way both have an impact on the camera lens, you can tell the SDR screen compromises brightness over detail.
What I mean is the way both have an impact on the camera lens, you can tell the SDR screen compromises brightness over detail.
Right, but my post you argued with in the first place was that the hdr screen allowed for a brighter image without sacrificing detail in bright areas due to the enhanced contrast level. You told me the brightness had nothing to do with the hdr screen, and that's false. This thread wasn't about hdr10, it's about an hdr display vs a non hdr display.
So should I believe a condescending neogaf poster, or the multitude of websites and my own eyes when personally viewing things on my own hdr tv?
The only difference between an HDR screen and a non-HDR screen is the color of the primary illuminants (red, green and blue). There isn't any secret new technology being deployed in the production of the actual screens that exists in one but not the other. I think the big difference that will be seen initially is that manufacturers are pushing these as their high-end sets, so for the short term there will be more money put into the screens of HDR displays, and they will have that advantage.
All the comments bashing HDR based on completely illogical photo/graphic comparisons are majorly off the mark.
Go to a store and see it in action. It's truly amazing. The most impressive jump in display tech since I saw a plasma tv for the first time, way back when.
Ok, but I was specifically talking about the display posted in this thread. I would also assume that going forward, all tvs with a 1000:0.05 contrast ratio would be 10-bit panels, and therefore hdr.
I'm not being condescending on purpose.
I'm saying that it has nothing to do with contrast ratio. The 10-bit colour is why you can get a brighter image with detail. HDR doesn't make screens brighter. And any picture posted, showcasing any differences, will never show any benefit of HDR -- because if there is detail in one screen or the other, it is achievable on our SDR displays, including the intense contrast ratio.
You asked why monitors aren't HDR. I explained why.
The write-up for the BVM-X300 is my best explanation, but it's extremely complicated (it is an HDR monitor, but a professional one; it's what they use to produce HDR content, etc.).
That's all there is, really.
Sadly the PS4 pro won't be bringing the 4k part
I think the best way to explain HDR is with this example:
"Say you are taking a picture of a red wagon. Current TVs can choose from 256 shades of that red color to show all the highlights and shadows. HDR TVs can use 4096 shades of that color to show even more nuanced color variation. It is like having a larger box of crayons with which to draw the image."
How is that?
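For scale, here's what those per-channel numbers actually work out to. (Note the 4096 figure in that quote is really 12-bit; HDR10 itself is 10-bit, i.e. 1024 levels per channel.) Just a toy print-out:

[code]
# Levels per channel at each bit depth.  Note: HDR10 is 10-bit (1024 levels);
# the 4096 figure corresponds to 12-bit.
for bits in (8, 10, 12):
    levels = 2 ** bits
    print(f"{bits:2d}-bit: {levels:5d} levels per channel, "
          f"{levels ** 3:>14,} combined RGB values")
[/code]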
I thought ps4 pro can do native 4K?
As an owner of a 4k tv without HDR, I'm going to wait for OLED TVs to have better response times / for the ones with decent response times to drop in price by like 25-30%.
I'm jelly over HDR, but I have no issues with my current TV at all.
It can and does, people like to shitpost
What does the "dynamic range" in hdr stand for then? Dynamic range of what? If not contrast and color, please explain.
Edit: in my personal experience, hdr content has brighter brights against fuller darks, and more detail in dark scenes. So how is brightness not a part of it?
It doesn't work like that. To attempt to make that shade of red with 8-bit color you have 256 levels each of red, green and blue (0 is black, and with video standards you only get steps 16 - 235 to make that shade). Odds are it's impossible to make that shade of red anyway, because it doesn't even exist in the REC 709 color gamut.
That's what HDR displays are bringing to the table. That shade of red might now fall in the REC 2020 gamut. Then you have more granularity (10 bit color) to try and hit that shade of red.
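To make the "falls in the gamut" idea concrete, here's a small sketch that checks whether a CIE 1931 (x, y) chromaticity sits inside the Rec. 709 triangle versus the Rec. 2020 triangle. The primaries are the published ones; the "deep red" point is a made-up example, not a measured colour:

[code]
# Sketch: is a colour's CIE 1931 (x, y) chromaticity inside a gamut triangle?
# The primaries below are the published Rec. 709 and Rec. 2020 values; the
# test point is a made-up "deep red" chosen for illustration.
REC_709 = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]
REC_2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]

def _cross(o, a, b):
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def inside(p, tri):
    # p is inside (or on an edge of) the triangle if it sits on the same
    # side of all three edges
    signs = [_cross(tri[i], tri[(i + 1) % 3], p) for i in range(3)]
    return all(s >= 0 for s in signs) or all(s <= 0 for s in signs)

deep_red = (0.69, 0.30)   # hypothetical chromaticity, redder than the 709 primary
print("inside Rec. 709 :", inside(deep_red, REC_709))    # False
print("inside Rec. 2020:", inside(deep_red, REC_2020))   # True
[/code]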
I guess I'm just confused?

It's marketing bullshit. High Dynamic Range (HDR) can mean whatever these TV companies want it to mean, but as far as Netflix, the Xbox One S, UHD Blu-ray etc. are concerned, it means more colour.
They are separate things.
If you took that content and put it on an SDR display, just like those images in the link I sent you, the brightest parts and darkest parts would still have the same nits (if I'm allowed to simplify); it's just that there wouldn't be as much detail.
It's like this ~ picture a scene with a flame in the middle and a grey coat in the background.
HDR Screen with Good Contrast Ratio = You can see the red in the flame, and the grey coat is visible.
HDR Screen with Bad Contrast Ratio, but on high brightness = You can't see the red in the flame, and the grey coat is visible.
HDR Screen with Bad Contrast Ratio, but on lower brightness = You can see the red in the flame, and the grey coat is invisible.
---
SDR Screen with Good Contrast Ratio = You can't see the red in the flame, and the grey coat is visible.
SDR Screen with Bad Contrast Ratio = You can't see the red in the flame, and the grey coat is not visible.
---
I hope you can separate these...
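If numbers help, here's the flame example as a toy calculation: the same very bright, saturated red clipped against a low display peak versus a higher one. Every value is invented purely for illustration:

[code]
# Toy version of the flame example: the same very bright, saturated red in
# linear light, shown on a display that clips at 100 "units" versus one that
# clips at 1000.  All numbers are invented purely for illustration.
def clip_and_describe(rgb, peak):
    r, g, b = (min(channel, peak) for channel in rgb)
    verdict = "still clearly red" if r > 1.5 * max(g, b) else "washed out toward white"
    return (r, g, b), verdict

bright_red = (600.0, 80.0, 60.0)   # hypothetical flame highlight
print("peak 100 :", clip_and_describe(bright_red, 100))
print("peak 1000:", clip_and_describe(bright_red, 1000))
[/code]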
You can make it with dithering =P
6-bit displays have been kicking ass for the longest time.
Assuming the signals are mapped to the same luminance range, this isn't right. If red is straight-up desaturated to yellow/white then that's a result of a luminance blow-out in all the color channels. A wider color spectrum won't fix this, because having a red enough red isn't the problem. What'll fix it is a better contrast ratio that can allow for really bright reds to be displayed.
Right. In REC 709 color space, dithering between (128,128,0) and (128,128,1) will give you an impression of an intermediate yellow tone that's basically (128,128,0.5). But you can't use such methods to get a tone that's, say, more yellow than (128,128,0). You need to define a wider color space to do that, and that's what the REC 2020 color space is about.

Unfortunately you can't hit any color that doesn't exist in the display's available color gamut. A display is only capable of producing colors that fall within its triangle. There's no trick you can do to produce colors outside of this gamut. You are limited by whatever colors of red, green and blue the manufacturer produced the screen with, and the backlight has to be capable of illuminating those colors.
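To illustrate both halves of that: dithering genuinely can fake the in-between step, but no amount of it takes you outside the triangle. A toy sketch with made-up values:

[code]
import random

# Toy sketch: flipping one channel between two adjacent 8-bit codes in the
# right proportion averages out (over area or time) to the in-between tone.
random.seed(0)
samples = [128 if random.random() < 0.5 else 129 for _ in range(100_000)]
print(sum(samples) / len(samples))   # comes out around 128.5
# ...but no amount of dithering between in-gamut codes can produce a colour
# outside the gamut triangle; for that you need wider primaries (Rec. 2020).
[/code]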
I get the feeling people who have been running their existing SDR or even HDR displays in uncalibrated wide colour gamut + torch mode brightness + game mode (...perhaps the vast majority of gamers, especially), despite not having wide colour/HDR sources, are going to be the most confused when seeing HDR content.
There are already hdr10 displays with poor contrast ratios, and hdr doesn't look impressive on them. That's why a high dynamic range of contrast is necessary. Source: I owned one before upgrading to my current display.

I feel like I included this in my example. But of course it's an oversimplification. We will, for sure, get HDR10 displays that have terrible contrast ratios unless the industry suddenly no longer cares about making trash displays.
...
Omg I'm being trolled
No?
Edit:
I owned the sony 850d. It gets super bright and has wide color gamut, unfortunately the ips panel doesn't get dark enough, so instead of popping, hdr content looks washed out.
I now own the ks8000 and hdr content is amazing and all the bright highlights and silhouettes in hdr content are mesmerizing.
I believe you. So why were you saying that HDR has anything to do with contrast ratio? You know yourself that HDR content alone doesn't offer what you're asking.
High dynamic range of contrast ratio is a term you have made up, just now.
Again, what you're looking for is UHD Premium (which dictates contrast ratio, and is merely a standard), HDR honestly just refers to PQ and 10bit colour (and metadata).
I've been repeating that in every single post since the very beginning, and all this time you had an HDR display that didn't do what you were saying HDR WAS doing...
Just... nevermind. Really I don't win anything here, I'll accept that I'm wrong if it means the end.
Whenever I try playing back HDR footage on my computer on my HDR projector it looks washed out. I wish this wasn't a pain in the ass to set up and shit just worked >_> No idea what's wrong.
Also have no idea how to set up my Shield to output HDR to my HDR TV. Settings grayed out at 4:2:0 or whatever *sigh*
You said that an HDR display with good contrast would be able to see red in a flame whose colors are blown out on an SDR display with good contrast.

I feel like I included this in my example.
HDR is like Digital Vibrance Control by NVIDIA but not but kind of.
You said that an HDR display with good contrast would be able to see red in a flame whose colors are blown out on an SDR display with good contrast.
If the lack of redness in the SDR display is due to all color channels being blasted high to preserve brightness, you'd have basically the same issue with an HDR display of comparably good contrast.
Where HDR would really help a lot is when the lack of redness is due to a display not being able to display a red enough red.
What a blown-out flame really needs is more contrast.
You're blending brightness and saturation. I'm assuming a very bright red that belongs outside of Rec 709. It was a simplified example.
See, this I don't really like. I already felt this in the Horizon demo, but it seems like a blue and orange enhancer.
Night and day difference.
damn it, late on the joke
If it's simply a matter of a red being outside of REC 709, and the similar-contrast HDR display could handle it correctly, an SDR display would still be able to display it pretty red; just not as red.
I mean this is a relatively pointless hypothetical that doesn't exist in the real world. I just meant to try and separate contrast and wider colour gamut.
Regardless, I feel like there is enough information here.
It's pretty exciting to finally be upgrading from 8-bit... We should never go back.
The phrase was originally applied to this context in the sense of high contrast.

I guess where I get confused is, if you can have a non wide color gamut display with hdr, and you can also have hdr that is mapped to rec 709, then what is hdr if not the combination of color and contrast?