
So what IS the HDR standard?

This is all so frustratingly confusing.

I have a Samsung UN55KU630DF.

I didn't really know anything about HDR until yesterday, so I did some research, and it seems like my TV doesn't have the "wide color gamut" necessary for HDR.

But on my box it says this:



So after reading this thread, which is great btw, thank you, I don't know what I have. :(

Even more confusingly, you can have HDR 8-bit panels. More confusing still, some 8-bit panels are better at HDR than some 10-bit ones, thanks to dithering techniques.

Samsung are particularly good in this regard.
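
If you're curious how that works, here's a rough Python sketch of frame-rate control (FRC), the temporal dithering trick 8-bit panels use to fake 10-bit shades. It's purely illustrative, not any vendor's actual algorithm:

# Approximate one 10-bit level (0-1023) with a short sequence of 8-bit
# frames (0-255) whose average works out to the 10-bit target.
def frc_frames(level_10bit, num_frames=4):
    lo = level_10bit // 4          # nearest 8-bit level at or below
    remainder = level_10bit % 4    # leftover fraction, in quarter-steps
    # Show the brighter neighbour on `remainder` out of every 4 frames.
    return [min(lo + 1, 255) if f < remainder else lo
            for f in range(num_frames)]

frames = frc_frames(514)               # a 10-bit level between 128 and 129
print(frames)                          # [129, 129, 128, 128]
print(sum(frames) / len(frames) * 4)   # averages back to 514.0

Real panels do this per pixel at high refresh rates, so your eye integrates the flicker into an in-between shade.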
 
Oh man, pleasantly surprised. I bought it before all this HDR hoopla. Didn't think I had it.

Apparently I have to do the following:

Settings > Picture > Expert Settings > HDMI UHD Color

HDMI UHD Color does not affect HDR. I see a lot of confusion about this.
 
Don't mean to hijack this thread, but I don't know what this is called and it is killing me:

[image: M7Gw0ha.png, a gradient screenshot with visible banding on the left side]

What is the "banding" called on the left side of this picture? I got excited when I saw this picture mentioned alongside error diffusion, but unfortunately a quick google search does not lead me to believe they are the same thing.

I'm asking because I've always wondered, in gaming, whether this means I have something incorrectly calibrated. Or is it hardware-based? It's most frequently noticeable in dark scenes. I figured this would be the best place to ask since I'm assuming it has something to do with color range capabilities. It's something that has bothered me since the beginning of the "HD" era, and I've run the gamut of hardware configurations at varying costs, from PC to console gaming, and I've always noticed it.
 

Not hijacking at all. This is a side-effect of not having enough bit depth for the colours. You're attempting to reproduce a range of colours that aren't available, so you can't get smooth gradients.

If you're seeing this problem across different media, also know it can be a side-effect of compression.

Relevantly, this is also why you lose detail if you try to show a picture mastered for 10-bit with our current standard. I think the previous page of this thread has an example, like a field where the clouds have detail one way and none the other.

As for what this is called, I think it's just literally called "banding".

Error diffusion is a technique to stop it from happening as much: an algorithm lets neighbouring pixels take slightly different colours, creating the illusion of the original shade and thus preserving detail even though you've compressed the image.

https://en.wikipedia.org/wiki/Colour_banding
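
Here's a toy 1-D version of error diffusion in Python, just to make the idea concrete (real Floyd-Steinberg dithering spreads the error to four neighbours in 2-D, so treat this as a simplification):

# Quantize a smooth ramp (floats in 0..1) down to a few output levels,
# pushing each pixel's quantization error onto the next pixel so the
# average brightness survives the reduced bit depth.
def error_diffuse_1d(row, levels=4):
    out, err = [], 0.0
    step = 1.0 / (levels - 1)
    for v in row:
        v += err                       # add the error carried from the left
        q = round(v / step) * step     # snap to the nearest output level
        q = min(max(q, 0.0), 1.0)
        err = v - q                    # remember what was lost
        out.append(q)
    return out

gradient = [i / 31 for i in range(32)]
print(error_diffuse_1d(gradient))      # a dithered mix of the 4 levels

Without the err term you'd get four flat bands; with it, the two nearest levels alternate in just the right proportions to mimic the original gradient.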

This is half of why HDR10 is so exciting: 10-bit colour will remove this problem almost entirely. The increased luminance allowed by SMPTE 2084 (PQ), and thus better contrast ratios, will mitigate the reason you see colour banding in dark scenes: there aren't enough different shades of dark colours (greys and blacks) in our current standard. You probably also notice lots of greens being used in dark scenes; this is a result of other, more complicated compression techniques.
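
The PQ curve itself is public (SMPTE ST 2084), and a few lines of Python show why dark scenes benefit most. The constants below are from the published standard; the framing around them is just mine:

# SMPTE ST 2084 (PQ) EOTF: non-linear signal in [0, 1] -> absolute
# luminance in cd/m^2 (nits), peaking at 10,000.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_eotf(signal):
    p = signal ** (1 / M2)
    return 10000 * (max(p - C1, 0) / (C2 - C3 * p)) ** (1 / M1)

# PQ spends its code values where the eye is most sensitive:
for code in (64, 128, 256, 512, 1023):
    print(code, round(pq_eotf(code / 1023), 3), "nits")

Run it and roughly the first 100 of the 1024 codes all land below a third of a nit, which is exactly the shadow range where 8-bit gradients fall apart.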

---

I don't believe you will get colour banding introduced by poor consumer calibration settings... but if you are noticing it a bunch on your PC, it could be because you have it set to High Colour (16-bit) and not True Colour (24-bit). Although this is 2016, so I doubt it.
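
A quick way to sanity-check the 16-bit theory is to count how wide the bands would be on a hypothetical 1920-pixel-wide ramp (illustrative numbers, not a measurement):

# A smooth ramp quantized to 5 bits (High Colour's red/blue channels)
# versus 8 bits (True Colour) shows why 16-bit modes band so visibly.
def quantize_ramp(width, bits):
    levels = 2 ** bits
    return [round(x / (width - 1) * (levels - 1)) for x in range(width)]

ramp5 = quantize_ramp(1920, 5)             # only 32 distinct steps
ramp8 = quantize_ramp(1920, 8)             # 256 distinct steps
print(len(set(ramp5)), len(set(ramp8)))    # 32 256
print(1920 // 32, "px per band vs", 1920 // 256, "px per band")

60-pixel-wide bands are impossible to miss; 7-pixel bands are narrow enough that dithering can hide them.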
 
You're awesome. Thank you. I swear I had googled every type of "banding" other than color banding because I generally only notice it in dark scenes. You hit the nail on the head in regards to greens and greys used to replicate dark scenes. For the longest time I thought I was having limited/full issues with consoles/tv panels until I realized it had been with me on PC all the while as well.

Out of all the perks mentioned in these HDR threads, this is probably the biggest sell for me. The visuals are stunning, but it's one of those things where, as long as it isn't side-by-side, I can live without it for a while. To help alleviate color banding? I'm sold.

Thanks for the help :)
 
Thought I'd add a part of HDR that I don't see many people talk about: static metadata vs. adaptive metadata.

Static Metadata
Watching a movie with your brightness/contrast set to one setting (e.g. 50) all the way through the film.

Adaptive Metadata
Watching a movie where the brightness/contrast adjusts itself frame by frame. A great example is a candlelit scene from Marco Polo in Dolby Vision: the brightness of the candle pixels increases and decreases subtly (don't confuse this with somebody increasing the contrast/brightness of the whole TV panel; only the pixels that need to increase in brightness do).
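
A toy tone-mapping example makes the difference obvious. The nit values are made up, and real TVs use smarter roll-off curves than this linear scale, but the principle holds:

# Static metadata: the TV tone-maps everything against one peak value
# for the whole film. Dynamic metadata carries a per-scene peak, so
# dim scenes keep their intended brightness.
TV_PEAK = 750.0   # hypothetical panel's actual peak output, in nits

def tone_map(pixel_nits, content_peak):
    # Crude linear compression from the mastered peak to the TV's peak.
    return pixel_nits * min(1.0, TV_PEAK / content_peak)

candle = 180.0                     # brightest pixel in a dark scene
print(tone_map(candle, 4000.0))    # static, whole-film peak: 33.75 nits
print(tone_map(candle, 200.0))     # dynamic, this scene's peak: 180.0 nits

With one static peak for the whole film, the candle gets crushed to roughly a fifth of its mastered brightness; with per-scene metadata it displays as intended.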

MS/Sony have opted for the open HDR10 format, but HDR10 currently does not support adaptive metadata.

Current State of HDR10/Dolby Vision:
- HDR10 – Requires the yet-to-be-finalised HDMI 2.1 specification for delivery of adaptive metadata. Xbox One S/PS4 Pro only have HDMI 2.0a.
- Dolby Vision – Supports adaptive metadata already, but only through streaming services. Dolby Vision UHD Blu-ray discs/players are to be released in 2017 (http://www.flatpanelshd.com/news.php?subaction=showfull&id=1474359278).

Adaptive metadata makes a difference, and it lets creators mimic how brightness/contrast works in real life. We won't be able to experience this in games till … hopefully Scorpio, with HDMI 2.1 included?

The Xbox One S/PS4 Pro would not be able to support adaptive metadata, as they currently only support HDMI 2.0a, and it's not known if this can be firmware-upgraded to HDMI 2.1.
 


Wth?

There's 7 pages of HDR talk without including what is probably the most important part of HDR going forward? Damn :')


Btw, in another thread we talked about that last point, and onQ123 thought the XB1S and PS4 Pro might be able to get Dolby Vision in the future.

Also, HDMI 2.1 could still just be a firmware update (unlikely, but it's possible).

In any case, PS4 Pro not supporting Dolby Vision is a huge loss. Would have been a real standout feature re: HDR.
 


God, I really hope we can just get a one-box upgrade. I know that was the point of it. The problem is how expensive it's going to be.
 
So what I'm taking away from all this is that there's still no single HDR standard for the industry, but we're probably getting close? I get super confused when terms like HDR10 are thrown in, and even that has two different criteria manufacturers can aim for; then labels like UHD Premium, which seem consumer-friendly, aren't being adopted by everyone. It means if I walk into an electronics store I have to consider asking staff for advice, and I HATE asking them for advice. I remember a guy at PC World (UK) telling my mum, who was looking for a laptop simply for emails and Facebook, that she shouldn't consider the cheaper models because they won't be able to do emails. I told him where to go.

BAH!

I'm sticking to the standard PS4 though and not upgrading to the Pro. So just got to hope the TV industry gets this mess sorted by the time the PS5 starts rolling out.
 
I posted this in the TV thread, but it probably should have been here. I received my Samsung MU6500 65-inch curved TV today from Amazon. I assumed, with HDR plastered all over the ads, that it would work perfectly with HDR PS4 gaming? I keep trying to set it up, but the PS4 says it does not have this feature. Anyone have some info on this TV?
 

The two most common issues are:

1.) Make sure you have enabled UHD Color on your TV.
2.) In addition, it is likely that only 1 or maybe 2 of your HDMI inputs accept HDR.

See the KS8000 thread if you need more details on how to do this.
 