
HDR comparison shots

TSM

Member
It's fine to care about technical stuff, but I don't think OP was trying to sell this as a 100 percent example, just showing some of the contrast differences which do come across here.

And if you just care about technical aspects, why did you jump to some crazy conspiracy theory that a user on NeoGAF is posting pictures of a 2015 Samsung model to try to get you to buy a new TV?

I'm not? I was more talking about the split screen images from different sources with an agenda. The OP didn't provide enough info about his process for his images to matter.
 
The easiest way to describe HDR is that video footage, especially if at a high framerate, for the first time ever, actually looks like real life.

Every scene is like "Oh, that's what it actually looks like outdoors." and, with wide color gamut, "Oh, that's actually red!" (which seems like a weird thing to say, but that's what it's like. You realize you hadn't ever seen colors correctly before on a screen).

I think the coolest part of HDR, and the color graders looking for a wow-effect on early films know this, is how for the first time, fire is correctly rendered. With good HDR, you can see every variation of color and detail in a fire, regardless of ambient lighting conditions, rather than just a reddish outline with white in the middle because of clipping at a single exposure.

Good HDR doesn't actually look like any of those demo images, where the contrast is simply cranked up. It looks natural. That said, there's a lot of bad HDR out there that oversaturates colors, clips on the bright or dark ends, or goes crazy with color filters to ruin the effect.
 

MazeHaze

Banned
The easiest way to describe HDR is that video footage, especially if at a high framerate, for the first time ever, actually looks like real life.

Every scene is like "Oh, that's what it actually looks like outdoors." and, with wide color gamut, "Oh, that's actually red!" (which seems like a weird thing to say, but that's what it's like. You realize you hadn't ever seen colors correctly before on a screen).

I think the coolest part of HDR, and the color graders looking for a wow-effect on early films know this, is how for the first time, fire is correctly rendered. With good HDR, you can see every variation of color and detail in a fire, regardless of ambient lighting conditions, rather than just a reddish outline with white in the middle because of clipping at a single exposure.
This is a great description! I'm still blown away by simple things like candles in a dark scene. The way the screen stays dark but the candle is so bright, I think "wow, just like a real candle!" And the way light flickers on people's faces and objects accurately is amazing.
 

TSM

Member
And then you said the OP has an agenda to sell displays lol.

No, there was a misunderstanding. I should have been more clear. I'm talking about the bogus, artist-created split-screen images.

The OP is just a mess with no technical explanation for anything he did. Were the 2 displays even calibrated? I'd imagine not.
 

MazeHaze

Banned
No, there was a misunderstanding. I should have been more clear. I'm talking about the bogus, artist-created split-screen images.

The OP is just a mess with no technical explanation for anything he did. Were the 2 displays even calibrated? I'd imagine not.
Ah gotcha. I agree this example isn't technically sound, but it does illustrate some of the differences I see when I do the same comparisons in person on my own displays, which I think is all the OP was going for.
 

YBdisk

Member
Why don't any of you just try to explain HDR with an example that most people here (gamers) should already be familiar with? Like the current gen consoles' (Wii U am cry) Limited RGB vs Full RGB option. Go ahead and adjust the settings on your console and TV from limited to full and you'll see the difference between them (unless you have a really old TV without full RGB). Now think of HDR as the next available option.

I don't have a PS4 or XBONE, so if this doesn't work forget I said anything ;).
 

Izuna

Banned
I've decided to cross post if that's okay. I feel like the thread title here brings different people in looking for the same info.

http://www.neogaf.com/forum/showthread.php?t=1276662

WHAT IS HDR?

HDR10 is an open standard in the industry. It has an odd, hard to remember name. That’s why you probably won’t see “HDR10” listed on many specification sheets or boxes. The TV will simply say it supports “HDR” and you’ll have to assume it supports HDR10 content. ~How-To Geek
  • 10-bit Colour
    10-bit Colour, 2^30, gives a total of 1,073,741,824 (1.07b) colours
    8-bit Colour, 2^24, gives a total of 16,777,216 (16.8m) colours
  • Rec. 2020
    This means the available colours come much closer to covering the visible spectrum of our eyes. While HDR Media Profile doesn't specify how much of this should be covered, a UHD Premium label on a display means it covers at least 90% of DCI P3...
    hY2lpzh.jpg

    Screens of today use Rec.709 (much smaller) - this, by the way, is why it is important for us not to try and view HDR images directly on our non-HDR screens. The colours will all fall in the wrong place.
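If you want to sanity-check those colour counts from the first bullet, the arithmetic is just the number of per-channel shades cubed; here's a quick Python sketch (plain arithmetic, nothing display-specific):

    # Total displayable colours for a given bit depth per channel.
    # Red, green and blue each get 2**bits shades, so the total is that value cubed.
    def total_colours(bits_per_channel):
        return (2 ** bits_per_channel) ** 3  # same as 2 ** (3 * bits)

    print(f"{total_colours(8):,}")   # 16,777,216 -> 8-bit per channel (24-bit True Colour)
    print(f"{total_colours(10):,}")  # 1,073,741,824 -> 10-bit per channel (30-bit)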


SO, WHAT DO WE SEE?

colour_infographic.jpg


On a conventional display, what the game is actually being rendered in is reduced to fit Rec. 709, so we are losing the ability to show millions of natural colours that we see in real life. With Rec. 2020 (which is what HDR maps to) we are able to display them.

1.jpg


A good example is to look at a picture of a red flame. It would have really bright reds, but in Rec. 709 that red isn't available. So what do we do? At the moment, that red is mixed with blues and greens to make it as bright as we want it. This is the technique we have had to make do with up until HDR.

On an HDR screen (and if this image itself was made for Rec. 2020) the colour of that flame would be closer to how it is in real life.


WHAT'S THE DEAL WITH THIS TALK OF MORE BRIGHTNESS?

This is a hard one. I won't embed them on this page, but throughout GAF you see these horribly inaccurate SDR (normal) vs. HDR images where the HDR side is simply brighter and more saturated. While these images are false, they're not entirely misleading.

This is the answer I have decided to go with:

UHD-Premium-logo-1.jpg


The above logo refers to a specification that may allow a TV to be advertised as UHD Premium. The requirements are:
  • At least 3,840 x 2,160 (4K)
  • 10-bit Colour
  • At least 90% of DCI P3
  • More than 1000 nits (peak brightness) and less than 0.05 nits black level or,
  • More than 540 nits (peak brightness) and less than 0.0005 nits black level (since you don't need high brightness if your blacks are so good, like an OLED)
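To put those two contrast options in perspective, here's a rough Python check of the ratios they imply (expressing the range in "stops", i.e. log2 of the ratio, is just my own framing, not part of the spec):

    import math

    # Contrast ratio and dynamic range implied by a peak-brightness / black-level pair.
    def contrast(peak_nits, black_nits):
        ratio = peak_nits / black_nits
        stops = math.log2(ratio)  # each stop is a doubling of light
        return ratio, stops

    print(contrast(1000, 0.05))    # ~20,000:1, about 14.3 stops (the LCD route)
    print(contrast(540, 0.0005))   # ~1,080,000:1, about 20 stops (the OLED route)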

What's happening is that while HDR10 is an open platform, and contrast ratio requirements aren't part of the specifications (there are none), TV companies are sort of pretending it DOES have minimum specifications -- and that all TVs prior to HDR10 are garbage with low brightnesses and horrible blacks. A lot of the comparisons you are looking at are UHD Premium compliant TVs that are showing their HDR compatibility along with their contrast ratio standards.

Does that mean a non-UHD Premium TV can't do HDR? No. It's just really unlikely that you'll get a screen sporting HDR that will have terrible contrast ratio, although it is certainly possible.

Some additional perspective:
...Philips, too, is sticking to HDR-only branding, although none of its tellys meet the UHD Premium specification anyway; its top-tier ‘HDR Premium' models only go up to 700nits, with its ‘HDR Plus' sets coming in around 400nits. ~ What Hi-Fi?


A COMPARISON BETWEEN 8-BIT AND LOWER?

If this looks like a minor difference, sure. You could write an entire paper on how we process the images we see. Essentially, we're pretty good at putting colours next to each other to make them appear like another colour:

f2N356N.png


I used Error Diffusion simply to prove a point. Even with just 256 total colours (down from 16.8 million), we have come a long way with compression techniques and can do a lot within a limited space so that you wouldn't see much of a difference. You don't need to use all these colours at the same time; it's more about what colours you can use. If you get a scene that needs lots of shades of the same colour, that's where you need more colour depth:

M7Gw0ha.png


Here I used Nearest to prove a point (I'm cheating). Anyway, you can see in this image that there is no way for the limited number of colours to show BFF-chan's face without some serious issues. We only have 256 colours available.

These comparisons are completely cheating anyway. If you had a display that could only produce 256 colours, it doesn't mean 256 at once, it means 256 overall. That means our 256-colour display can't produce both images. What you're more likely to get is this:

yHWCLnC.png


So with HDR10, there are some scenes we can show that we currently don't even think of showing. Think back to the picture of the fire, can you see the individual bright reds in the middle of the flame? No, because that information isn't there and our SDR screens couldn't display it anyway. So you have to understand, these images won't even show you anything new on your HDR display because they were shot in 8-bit.
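If you want to play with the banding idea above, here's a minimal numpy sketch of the "Nearest" case (no dithering), just crushing an 8-bit greyscale ramp down to 16 levels; error diffusion would hide the resulting bands by spreading the rounding error across neighbouring pixels:

    import numpy as np

    # A smooth 8-bit greyscale ramp: 256 distinct shades across one row.
    ramp = np.arange(256, dtype=np.uint8)

    # Pretend we only have 16 grey levels: snap each value to the nearest available one.
    step = 256 // 16
    banded = (ramp // step) * step

    print(np.unique(ramp).size)    # 256 shades in the source
    print(np.unique(banded).size)  # 16 shades left -> visible bands on a smooth gradient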

IS IT LIKE LIMITED VS. FULL COLOUR SPACE?

Yes... But the differences are FAR greater for what we call DEEP COLOR:
  • LIMITED : < 8/24-bit : (16 - 235) = 220*220*220 = 10,648,000 (10.6m) Colours
  • FULL : = 8/24-bit : (0 - 255) = 256*256*256 = 16,777,216 (16.8m) Colours
  • DEEP : 10/30-bit : (0 - 1023) = 1024*1024*1024 = 1,073,741,824 (1.07b) Colours
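The same arithmetic in Python, if you want to verify those totals (limited range uses the 16-235 video levels, i.e. 220 values per channel):

    limited = (235 - 16 + 1) ** 3   # 220 levels per channel -> 10,648,000
    full    = 256 ** 3              # 0-255 per channel      -> 16,777,216
    deep    = 1024 ** 3             # 0-1023 per channel     -> 1,073,741,824

    print(f"{limited:,} / {full:,} / {deep:,}")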


8-BIT? 24-BIT? 10-BIT? 30-BIT?

ikr? 8-bits means 256 shades per colour. Since we use red, green and blue, we sometimes call this 24-bit (8*3) True Colour. 10-bit is 1024 shades etc.


IS HDR10 JUST 10-BIT COLOUR THEN?

No, HDR Media Profile is other stuff (in super simple terms):
  • EOTF: SMPTE ST 2084, or Perceptual Quantizer (PQ), lets us use Rec. 2020 and much higher luminance.
  • METADATA: SMPTE ST 2086, MaxFALL, MaxCLL - This allows the device to tell the screen what to do with the image. This is why your 10-bit monitors can't display HDR content etc.
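To make the metadata part a little more concrete, here's a purely illustrative Python sketch of the static metadata an HDR10 stream carries alongside the video; the field names are descriptive (my own), not any real API:

    from dataclasses import dataclass

    # Illustrative only: the static metadata HDR10 sends with the video stream.
    @dataclass
    class HDR10StaticMetadata:
        # SMPTE ST 2086 mastering display colour volume
        display_primaries: tuple        # (x, y) chromaticities for R, G and B
        white_point: tuple              # (x, y) chromaticity
        max_mastering_luminance: float  # cd/m2 (nits)
        min_mastering_luminance: float  # cd/m2 (nits)
        # Content light level
        max_cll: int                    # MaxCLL: brightest single pixel in the content
        max_fall: int                   # MaxFALL: highest frame-average light level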


I WANT THE UNSIMPLIFIED EXPLANATION
--
 

eso76

Member
?

The thing that doesn't come across in these pictures is that the bright highlights (like the sun) in the HDR shot are like 3 times brighter than the sun in the SDR shot. Yet in the SDR shot the much dimmer sun washes out the entire shot, whereas with HDR a high contrast level is still maintained.

Yes, that's more or less what I said.
HDR allows for much more brightness (and... shrug... darkness) to coexist on screen at the same time.
I haven't seen a proper set with my own eyes yet; I can only imagine what it is like from knowing very well the difference between what my eyes can see and what my DSLR captures.

Like I said in another thread, HDR monitors should be a game changer for digital photography as well.
 

MazeHaze

Banned
Why don't any of you just try to explain HDR with an example that most people here (gamers) should already be familiar with? Like the current gen consoles' (Wii U am cry) Limited RGB vs Full RGB option. Go ahead and adjust the settings on your console and TV from limited to full and you'll see the difference between them (unless you have a really old TV without full RGB). Now think of HDR as the next available option.

I don't have a PS4 or XBONE, so if this doesn't work forget I said anything ;).
Because HDR is way beyond that example, and hard to describe. The post a few posts up that talks about "Whoa! That's actually red!" is a very good description.
 

Vipu

Banned
Is everyone trying to explain this in too hard a way?

Isn't it the same kind of difference as if you go into Windows display settings and change the color setting a step lower so there are fewer colors?

This way you can see that difference on your current screen.
 

MazeHaze

Banned
Is everyone trying to explain this in too hard a way?

Isn't it the same kind of difference as if you go into Windows display settings and change the color setting a step lower so there are fewer colors?

This way you can see that difference on your current screen.
No, because HDR isn't really about color; color is part of it. HDR is more about contrast and the display's ability to get insanely bright (enough that you kind of squint) while maintaining deep black levels. Like in real life when you see a very bright sun peeking through the trees.
 
It's fine to care about technical stuff, but I don't think OP was trying to sell this as a 100 percent example, just showing some of the contrast differences which do come across here.

And if you just care about technical aspects, why did you jump to some crazy conspiracy theory that a user on NeoGAF is posting pictures of a 2015 Samsung model to try to get you to buy a new TV?

It's ridiculous. I posted these in good faith and with the fairest settings I could manage, and that's what I've been saying all along: it's just an example of some of the benefits you can get with HDR, NOT the full deal. Obviously phones and whatever are not going to display full-on HDR content with a non-HDR picture, but those were what I saw when taking the pics (WITHOUT THE BRIGHTNESS IMPACT) in person, which makes a big difference btw. I checked each one against the TV screens to double check, and the details and colour clipping on show at the time of viewing looked the same on the phone screen (and in non-HDR mode on the JS screen).....

So the Sony looks blue... that's just how it is with calibrated settings. Sometimes your eyes are good enough to pick out certain details, and getting into the science to claim it's impossible to convey HDR, when that's not what I'm saying, isn't necessary.
 

SebastianM

Member
In the last picture, the HDR screen comes off looking like a regular real-life photo, whereas with the 1080p TV you can tell that it's a photo of an actual display, with all of the artifacts created by the camera lens.

At least there is a noticeable difference that we are able to appreciate with our non-hdr screens right there.
 

Izuna

Banned
Is everyone trying to explain this in too hard way?

Isnt it same kind of difference if you go to windows, display settings and change color settings step lower so there is less colors?

This way you can see that difference with your current screen.

Kinda.

When you do this, you see some banding issues because the content is made for True Colour displays. We don't normally get banding because we're not silly enough to try to reproduce 10-bit content on our conventional displays.
 

Izuna

Banned
In the last picture, the HDR screen comes off looking like a regular real-life photo, whereas with the 1080p TV you can tell that it's a photo of an actual display, with all of the artifacts created by the camera lens.

At least there is a noticeable difference that we are able to appreciate with our non-hdr screens right there.

No, because what you see on the right IS showing on your non-HDR screen. Literally EVERYTHING you see on a non-HDR display can be shown without HDR...
 

MazeHaze

Banned
In the last picture, the HDR screen comes off looking like a regular real-life photo, whereas with the 1080p TV you can tell that it's a photo of an actual display, with all of the artifacts created by the camera lens.

At least there is a noticeable difference that we are able to appreciate with our non-hdr screens right there.
Yep, you'll notice the lights in the background near the ceiling on the SDR display wash out the detail, because an SDR set can't get super bright while maintaining darker detail, yet in the HDR pic you can clearly see the grid of the light fixtures, even though the HDR image is substantially brighter.
 

MazeHaze

Banned
No, because what you see on the right IS showing on your non-HDR screen. Literally EVERYTHING you see on a non-HDR display can be shown without HDR...
This isn't entirely accurate though. If you look at those two screens in real life, you'll notice the same difference in the detail of the bright areas of the screen as you do in these photos.

Source: I did this comparison myself, same footage.

Edit: if anyone else wants to compare on their HDR screens, this is Chef's Table on Netflix.
 
HDR is kinda like VR in that you simply can't explain why it's better, somebody has to view it in person to "get it".

So far, my only experience with HDR is through my 2015 Sony TV that got HDR enabled through an update late last year. It doesn't meet the specs required to be officially HDR certified, and I've only watched HDR via Netflix streaming. So maybe my opinion is still invalid, but while I can tell a difference with HDR enabled, it doesn't blow my socks off. Yes, it looks better, but so far I don't get the over-the-top experience that others seem to get.
 

MazeHaze

Banned
HDR is kinda like VR in that you simply can't explain why it's better, somebody has to view it in person to "get it".

So far, my only experience with HDR is through my 2015 Sony TV that got HDR enabled through an update late last year. It doesn't meet the specs required to be officially HDR certified, and I've only watched HDR via Netflix streaming. So maybe my opinion is still invalid, but while I can tell a difference with HDR enabled, it doesn't blow my socks off. Yes, it looks better, but so far I don't get the over-the-top experience that others seem to get.
The 2015 Sonys don't get bright enough to truly take advantage; that's why they aren't HDR certified. I don't understand how the 850D this year got certified either; it doesn't have nearly enough native contrast to show the benefits of HDR. That one slipped through somehow though.

And the source material here isn't mind-blowing HDR either. Actually the only things on Netflix that make me say "wow!" are Cosmos Laundromat and, as much as I hate to admit it because the movie sucks, certain parts of The Ridiculous 6.
 

Izuna

Banned
This isn't entirely accurate though. If you look at those two screens in real life, you'll notice the same difference in the detail of the bright areas of the screen as you do in these photos.

Source: I did this comparison myself, same footage.

smh, no. There is no image you can show me that I can't display without HDR, because I'm looking at this thread on an 8-bit screen. So are you, most likely (and even if you weren't, those images are 8-bit).

HDR isn't making the image brighter.

Do the comparison with this:

7x7.png


You'll get whatever brightness difference due to the screens having different luminosity, this has nothing to do with HDR.

Or do you think that such an image is impossible to look at on a non-HDR screen? Or do you think my black level is better if I crop the image and only show black?

Any, and I really mean this, ANY detail you're talking about cannot be demonstrated in any image, to anyone not using HDR. All these comparisons are doing is making people think it's attainable with fiddling with their settings.
 

MazeHaze

Banned
smh, no. There is no image you can show me that I can't display without HDR, because I'm looking at this thread on an 8-bit screen. So are you, most likely (and even if you weren't, those images are 8-bit).

HDR isn't making the image brighter.

Do the comparison with this:

7x7.png


You'll get whatever brightness difference due to the screens having different luminosity, this has nothing to do with HDR.

Or do you think that such an image is impossible to look at on a non-HDR screen? Or do you think my black level is better if I crop the image and only show black?
That's not what I'm saying. I'm saying the luminosity of the hdr screen is much higher, the bright areas are much brighter, yet it still doesn't wash out the detail of those light fixtures the way the much dimmer sdr set does.
 

Izuna

Banned
That's not what I'm saying. I'm saying the luminosity of the hdr screen is much higher, the bright areas are much brighter, yet it still doesn't wash out the detail of those light fixtures the way the much dimmer sdr set does.

Fair enough, but you understand that this is just about colour in this sense. You can have the luminosity without the additional colours.

That extra detail is caused by the bright colours being impossible to show on an SDR set, or at the very least, not at the same time as many other colours.

Nothing to do with "more contrast". And you can't show those colours on an SDR display.
 

MazeHaze

Banned
Fair enough, but you understand that this is just about colour in this sense. You can have the luminosity without the additional colours.

That extra detail is caused by the bright colours being impossible to show on an SDR set, or at the very least, not at the same time as many other colours.

Nothing to do with "more contrast". And you can't show those colours on an SDR display.
HDR isn't just about color though, it's about contrast. Peak brightness vs. deep blacks, being displayed together at the same time. This aspect comes across in these photos, and that's what I am pointing out. The only thing about the contrast that doesn't come across is that the light highlights are extremely bright. The detail is still observable.

HDR is contrast.
10-bit or wide color gamut is color.
The HDR standard is the combination of both.

You can still have HDR even in black and white; HDR and wide color gamut just combine to make an outstanding picture, and that is the HDR standard.
 

Izuna

Banned
HDR isn't just about color though, it's about contrast. Peak brightness vs. deep blacks, being displayed together at the same time. This aspect comes across in these photos, and that's what I am pointing out. The only thing about the contrast that doesn't come across is that the light highlights are extremely bright. The detail is still observable.

HDR is contrast.
10-bit or wide color gamut is color.
The HDR standard is the combination of both.

You can still have HDR even in black and white; HDR and wide color gamut just combine to make an outstanding picture, and that is the HDR standard.

Where are you getting this from? There is literally no specification for contrast ratio for HDR. It's simply SMPTE 2084, 10-bit colour and metadata.

Are you getting confused with the UHD Premium standard which dictates minimum contrast ratio for TVs?
 
I think my comparison shows how the TV uses the metadata more than anything. Regardless of the HDR wow factor, it just isn't difficult to see the areas that are improved between HDR and a standard screen, before we get into the 'this photo does not show HDR' debacle.
 

MazeHaze

Banned
Where are you getting this from? There is literally no specification for contrast ratio for HDR. It's simply SMPTE 2084, 10-bit colour and metadata.

Are you getting confused with the UHD Premium standard which dictates minimum contrast ratio for TVs?
HDR isn't just 10-bit color though; there are loads of 10-bit monitors, but none of them are HDR. Just like there are 8-bit HDR TVs, though they aren't nearly as impressive, obviously. HDR is peak brightness without compromising black levels.
 

Izuna

Banned
I think my comparison shows how the TV uses the metadata more than anything. Regardless of the HDR wow factor, it just isn't difficult to see the areas that are improved between HDR and a standard screen, before we get into the 'this photo does not show HDR' debacle.

Can you do us a favour and take the same side-by-side of non-HDR content, without changing any settings?

I think you'll see the exact same thing.

HDR isn't just 10-bit color though; there are loads of 10-bit monitors, but none of them are HDR. Just like there are 8-bit HDR TVs, though they aren't nearly as impressive, obviously. HDR is peak brightness without compromising black levels.

Can you do me a favour, and with an open mind, read my post?

Or you could give me something to read that states, anywhere, that HDR has anything to do with brightness...
 

Syril

Member
What TV was it though? There is a pretty wide range of HDR performance across different sets. Also, those sets in Best Buy aren't really calibrated; better to check out HDR on all the displays rather than rely on one to show you. The LG OLEDs will knock your socks off. The KS8000 HDR reel is good too. The Vizio one doesn't really show it off well, and the Sonys don't either.
I don't remember exactly what TV it was on, but it was the only HDR demonstration in the entire store that wasn't a brazenly disingenuous splitscreen comparison where the non-HDR side was set way duller than what non-HDR screens actually look like. There were a lot of TVs showing HDR reels there (and the Best Buy from yesterday wasn't the first time I've seen HDR demonstrations at stores either), and they all looked nice, so I'm sure one of them was "knock your socks off" good. Seriously, what am I supposed to do to see what HDR is supposed to look like? I get that store displays may or may not be set up properly, but it would be a bit of a stretch on my part to assume that every one of them isn't. It could just be that I've been seeing them "properly" the entire time, and it's just not as mind-blowing to me as it is to others. I mean, I've always understood HDR to mean better contrast and range of color, so it doesn't exactly blow my mind to see exactly what I expect.
 

MazeHaze

Banned
Can you do us a favour and take the same side-by-side of non-HDR content, without changing any settings?

I think you'll see the exact same thing.



Can you do me a favour, and with an open mind, read my post?

Or you could give me something to read that states, anywhere, that HDR has anything to do with brightness...
http://www.trustedreviews.com/opinions/hdr-tv-high-dynamic-television-explained

You'll find plenty of other sources if you do the research yourself.

"There are two components to consider here. One is peak brightness, which rather unsurprisingly, refers to how bright a TV can go, measured in what’s known as nits. TVs must meet a specific target of nits in order to be given the HDR label.

The other measurement is black level. Similar to peak brightness, black level refers to how dark a TV image can appear and is also measured in nits. So, for example, a TV could have a peak brightness of 400 nits and a black level of 0.4 nits.

The difference between the peak brightness and black level is known as the contrast ratio. HDR TVs have to meet specific standards for peak brightness and black level which helps give them the dynamic appearance.
Read more at http://trustedreviews.com/opinions/hdr-tv-high-dynamic-television-explained#iqk8drqAhYOvp7AQ.99"

I believe the contrast ratio for hdr certification is 1000 nits brightness to .5 black level. Contrast ratio and dynamic range literally mean the same thing.
 

Izuna

Banned

Wrong.

ST2084 HDR - WHAT DOES IT REALLY MEAN?

The biggest confusion with regard to ST2084 HDR is that it is not attempting to make the whole image brighter, which unfortunately seems to be the way most people think of HDR, but aim to provide additional brightness headroom for spectral highlight detail - such as chrome reflections, sun illuminated clouds, fire, explosions, lamp bulb filaments, etc. - http://www.lightillusion.com/uhdtv.html

That minimum nits figure etc. is for this:

2437560_HOA_037_UHDA_LOGO_TM_FINAL-640x192.jpg


NOT HDR10. Thank you for confirming you didn't read my post.

HDR Media Profile literally has no such specification; there are no "LABEL" requirements. The requirements (to actually USE HDR content, not for the label) are:

* Note: HDR10 Media Profile is defined as:
EOTF: SMPTE ST 2084
Color Sub-sampling: 4:2:0 (for compressed video sources)
Bit Depth: 10 bit
Color Primaries: ITU-R BT.2020
Metadata: SMPTE ST 2086, MaxFALL, MaxCLL
- https://www.cta.tech/News/Press-Releases/2015/August/CEA-Defines-‘HDR-Compatible’-Displays.aspx

That's by the CEA.

EDIT: I just realised that I used the SAME source as you to produce my other post, which very clearly states that it is the UHD alliance that has those requirements. You didn't even read your own source.
 

MazeHaze

Banned
Wrong.



That minimum nits figure etc. is for this:

2437560_HOA_037_UHDA_LOGO_TM_FINAL-640x192.jpg


NOT HDR10. Thank you for confirming you didn't read my post.

HDR Media Profile literally has no such specification; there are no "LABEL" requirements. The requirements (to actually USE HDR content, not for the label) are:

- https://www.cta.tech/News/Press-Releases/2015/August/CEA-Defines-‘HDR-Compatible’-Displays.aspx

That's by the CEA.

EDIT: I just realised that I used the SAME source as you to produce my other post, which very clearly states that it is the UHD alliance that has those requirements. You didn't even read your own source.
And you didn't read my post.
The UHD Alliance gives the HDR label, and they only give it to TVs with a dynamic range of 1000 : .5


Edit: The definition of the term dynamic range is the difference between the smallest and largest. In this case, HDR as we speak of it refers to the contrast ratio.
 

Izuna

Banned
And you didn't read my post.
The UHD Alliance gives the HDR label, and they only give it to TVs with a dynamic range of 1000 : .5

No, they don't. HDR10 is an open platform.

If you read my post, you would see where I mentioned this:

But just because a TV isn't stickered, that doesn't mean it doesn't fit the bill - and here lies the problem.

Despite Sony's flagship 2016 XD93 and XD94 TVs easily meeting the required UHD Premium specification, they don't use the UHD Premium label. Sony, despite sitting at the UHDA table, has gone rogue, opting to use its own does-what-it-says-on-the-tin ‘4K HDR' logo instead, which it plans to apply to products that don't necessarily meet the stringent UHD Premium spec.

Philips, too, is sticking to HDR-only branding, although none of its tellys meet the UHD Premium specification anyway; its top-tier ‘HDR Premium' models only go up to 700nits, with its ‘HDR Plus' sets coming in around 400nits.

Read more at http://www.whathifi.com/advice/ultr...pecs-which-tvs-support-it#l3QuWflMqWvjYfeq.99

--

Anyway, I don't know why we are going to and from these tech-blogs, considering I sent you a link to the cta website showing that HDR has NO requirements.

And even so, I'm not even sure what you're arguing anymore. If you want to say it's just a label for contrast ratio, then you can't say that HDR mode on any TV is making the difference. You can get the same contrast ratio (and you always could) without an HDR television.

http://www.lightillusion.com/uhdtv.html

Take your time to read the truth.

EDIT:

nvm lmao, you decided to define your own meaning to HDR, k whatevs

PS4 Pro, Xbox One S HDR, and Netflix HDR don't have anything to do with contrast ratio. Deal?

The best image comparison I could find is from that very website.

sdr_simulation_normalised.jpg

SDR
hdr_simulation_normalised.jpg

HDR

Unfortunately, most ST2084 HDR demonstrations do not map the contrast range correctly, with the result that the overall image is simply much, much brighter, which is not the main intent of ST2084 HDR, as shown above.

Obviously, in the real world the extra dynamic range available with HDR would be used to re-grade the image creatively to benefit from the additional dynamic range - but extended highlight detail is the true reality of ST2084 HDR.
 

MazeHaze

Banned
No, they don't. HDR10 is an open platform.

If you read my post, you would see where I mentioned this:



--

Anyway, I don't know why we are going to and from these tech-blogs, considering I sent you a link to the cta website showing that HDR has NO requirements.

And even so, I'm not even sure what you're arguing anymore. If you want to say it's just a label for contrast ratio, then you can't say that HDR mode on any TV is making the difference. You can get the same contrast ratio (and you always could) without an HDR television.

http://www.lightillusion.com/uhdtv.html

Take your time to read the truth.

EDIT:

nvm lmao, you decided to define your own meaning to HDR, k whatevs

PS4 Pro, Xbox One S HDR, and Netflix HDR don't have anything to do with contrast ratio. Deal?

"HDR and HLG are not just about brighter displays, they about using the greater available display brightness to enable extended detail within the brighter highlights."

That's from your own source. HDR literally means extended contrast ratio. That's what it's for.
 

Izuna

Banned
"HDR and HLG are not just about brighter displays, they about using the greater available display brightness to enable extended detail within the brighter highlights."

That's from your own source. HDR literally means extended contrast ratio. That's what it's for.

I'm going to get myself banned if I keep this up.

I sincerely hope you learn to read.

They're talking about the additional colours (that create detail) you are able to have with Rec. 2020 and 10-bit colour.
 

MazeHaze

Banned
"What is high dynamic range?

The two most important factors in how a TV looks are contrast ratio, or how bright and dark the TV can get, and color accuracy, which is basically how closely colors on the screen resemble real life (or whatever palette the director intends).

HDR expands the range of both contrast and color significantly. Bright parts of the image can get much brighter, so the image seems to have more "depth." Colors get expanded to show more bright blues, greens, reds and everything in between."

http://www.cnet.com/news/what-is-hdr-for-tvs-and-why-should-you-care/
 

Izuna

Banned
I'm going to try my best to make this get to your head. If not, don't bother replying anymore because I've typed everything I can.

"What is high dynamic range?

The two most important factors in how a TV looks are contrast ratio, or how bright and dark the TV can get, and color accuracy, which is basically how closely colors on the screen resemble real life (or whatever palette the director intends).

Everything above this line, written by CNET means nothing. They're not trying to define HDR yet, so you agree to disregard this part right? I'm honestly not sure why you included it.

Bright parts of the image can get much brighter, so the image seems to have more "depth."

Half-true. Do you understand why? HDR isn't increasing the luminance of the screen. A screen can be as bright as it wants regardless of 10-bit colour, HDR10, etc. What they are oversimplifying here is that BECAUSE of:

Colors get expanded to show more bright blues, greens, reds and everything in between."

We keep bright objects, images etc. as bright and allow them to retain their actual colours. Someone earlier in the other thread gave a good example of a purple sequin dress reflecting a bright light. On SDR, that light was just as bright, but it was white. On HDR, it was purple, just like the dress, while still being the same brightness.

I'm very disgusted with how you will go from site to site, CNET of all places, to dispute a point I have made by quoting a CEA definition. Regardless of what "official" site we want to agree with, you are grossly misunderstanding what they're saying.

First you talked about how HDR gives us a specification for the UHD Premium values. Then you said that a TV can't say HDR unless it did. Then you said that HDR literally is contrast (and nothing to do with colours). Then you said it is about higher brightness. And now you just quoted CNET, who are doing their damnedest to explain to a layman the benefits of more, brighter colours.

I know it's not proper etiquette to tell people if you ignore them, but I must. I can't be disproving every post you make while you move the goalpost or try to invent semantics.

I only hope that those lurking at least have more stuff to read and can try to understand both of our points of view, regardless of who they agree with.
 

MazeHaze

Banned
This is just ridiculous. The fact is, HDR, when speaking of display technology, refers to an increased contrast ratio (brightness) and an expanded color palette. HDR10 or Dolby Vision don't make a display brighter; if you tried to play HDR10 content on an SDR TV you would see dull, washed-out colors and a poor contrast ratio. An HDR TV has a higher peak brightness and a lower black level than a standard definition TV. Do you really not know this?


Edit: standard dynamic range I mean.

If it has nothing to do with brightness, how come 10 bit computer monitors aren't considered hdr?
 
Now changing the spectrum (pun intended), for videos (H.264, H.265, VP9) does HDR mean larger files? (I'd bet so...)

I found my answer

"Let’s take a high-level look at how HDR is implemented to help frame the related technical and distribute issues. Briefly, the increased contrast delivered by HDR, along with the expansion of the color gamut from Rec 709 to Rec 2020, means that each compressed pixel in a stream will require much more data to define. Some of that data will be included with the compressed video, and some will be in an accompanying metadata file, which according to some estimates will add about 20% in file size."

Source
 

Izuna

Banned
If it has nothing to do with brightness, how come 10 bit computer monitors aren't considered hdr?

ijcagCv.png

This image is taken from a slide from nVidia

Look at where it shows HDR Decoder. When we stream HDR content to our TV, or the Xbox One S tries to send the signal, what it wants to know is whether or not the screen is okay to receive it. HDMI 2.0a (the port, not the cable) allows us to send over additional information to check this (similar to how DRM works: it will block you if it doesn't like the screen).
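Purely as an illustration of that capability check (this isn't any real HDMI API, just the logic described above sketched in Python, with made-up field names):

    # Hypothetical sketch: the source only sends HDR10 if the sink says it can take it.
    def sink_supports_hdr10(caps):
        return ("ST2084" in caps.get("eotfs", [])          # PQ transfer function
                and caps.get("accepts_st2086_metadata", False)
                and caps.get("bit_depth", 8) >= 10)

    hdr_tv  = {"eotfs": ["gamma", "ST2084"], "accepts_st2086_metadata": True, "bit_depth": 10}
    monitor = {"eotfs": ["gamma"], "bit_depth": 10}   # 10-bit panel, but no HDR signalling

    print(sink_supports_hdr10(hdr_tv))    # True  -> the HDR10 stream gets sent
    print(sink_supports_hdr10(monitor))   # False -> the source falls back to SDR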

Back to the requirements for HDR Media Profile:
  1. EOTF: SMPTE ST 2084
  2. Bit Depth: 10 bit
  3. Color Primaries: ITU-R BT.2020
  4. Metadata: SMPTE ST 2086, MaxFALL, MaxCLL

1. aka the Perceptual Quantizer (PQ): we need to be able to display Rec. 2020 -- which is the wider colour gamut. This alone doesn't mean more colours; it means the screen can pick out colours from this space. It also allows content to carry luminance levels of up to 10,000 cd/m2 -- not that anything does, it's just that it can (literally no TV does this). There's a rough sketch of the PQ curve a little further down.

2. The 10-bit Monitor has this, so that's fine.

3. The 10-bit Monitor can have this also. No worries.

4. Now here's another issue. These 10-bit monitors are not going to know what to do with this. Either the port isn't HDMI 2.0a, so it can't carry the metadata, or the monitor simply wasn't built to verify that it can accept it (it can't).

And then there's the problem of a decoder. Nvidia's solution here is to use the Nvidia Shield, which has an HDR decoder. These monitors don't have one. It's specific software, and it's part of why the Xbox One S won't send HDR content to them.
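To make point 1 above a bit more concrete, this is roughly what the ST 2084 / PQ curve does, sketched in Python: it maps a 0..1 code value to an absolute luminance, with the top of the scale sitting at 10,000 cd/m2 (the constants are the published ST 2084 ones):

    # SMPTE ST 2084 (PQ) EOTF: non-linear code value (0..1) -> luminance in nits.
    M1 = 2610 / 16384
    M2 = 2523 / 4096 * 128
    C1 = 3424 / 4096
    C2 = 2413 / 4096 * 32
    C3 = 2392 / 4096 * 32

    def pq_eotf(code):
        p = code ** (1 / M2)
        return 10000 * (max(p - C1, 0) / (C2 - C3 * p)) ** (1 / M1)

    print(round(pq_eotf(0.5), 1))  # ~92 nits: half the code range is still fairly dim
    print(round(pq_eotf(1.0)))     # 10000 nits: the theoretical ceiling no current TV reaches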

Also, I hope you're aware that there are 10-bit monitors with higher contrast ratios than TVs. It doesn't matter, because if they don't accept the signal, they aren't displaying anything.

Early 2017 is when they should start to.

When Netflix, the Xbox One S, the PS4 Pro etc. want to send this beautiful Deep Colour image to us, our receiver needs to be HDR Media Profile compliant. It's sort of like a screen needing the G-Sync board inside before you're able to turn on the feature at the OS level (or why I can't force my PC to push 144Hz to my Surface Book's screen).

Every point you made refers to Ultra HD Premium, which is the standard which basically says: "your TV must do HDR, 4K, and not be shit".
 