
So what IS the HDR standard?

Izuna

Banned

DISCLAIMER - the purpose of this thread is to define HDR from a technical standpoint, so that even a layman can understand the excitement and necessity of such a standard. You won't walk out of this thread knowing what HDR looks like, but you will understand its benefits. Some details here have been simplified and leave out the super technical stuff; you won't need them for HDR gaming etc. No image can show you HDR (or more specifically, a wider colour gamut) on a non-HDR screen.

WHAT IS HDR?

HDR10 is an open standard in the industry. It has an odd, hard to remember name. That’s why you probably won’t see “HDR10” listed on many specification sheets or boxes. The TV will simply say it supports “HDR” and you’ll have to assume it supports HDR10 content. ~How-To Geek
  • 10-bit Colour
    10-bit Colour: 2^30 gives a total of 1,073,741,824 (1.07b) colours
    8-bit Colour: 2^24 gives a total of 16,777,216 (16.8m) colours
  • Rec. 2020
    This means you will have colours available that are much closer to the visible spectrum of our eyes. While the HDR Media Profile doesn't specify how much of this must be covered, a UHD Premium label on a display means it covers at least 90% of DCI P3...
    [image: colour gamut comparison chart]

    Screens of today use Rec. 709 (a much smaller gamut) - this, by the way, is why it's important not to try to view HDR images directly on our non-HDR screens: the colours will all fall in the wrong place.


SO, WHAT DO WE SEE?

[image: colour infographic]


On a conventional display, whatever the game is actually rendered in gets reduced to fit Rec. 709, so we lose the ability to show millions of natural colours that we see in real life. With Rec. 2020 (which is what HDR maps to) we are able to display them.

[image: picture of a fire]


A good example is to look at a picture of a red flame. It would have really bright reds, but in Rec. 709 that red isn't available. So what do we do? At the moment, that red is mixed with blues and greens to make it as bright as we want it. This is the technique we have had to deal with up until HDR.

On an HDR screen (and if this image itself was made for Rec. 2020) the colour of that flame would be closer to how it is in real life.
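
If you think in code, here's a rough sketch of what "falling in the wrong place" means: converting a wide-gamut colour to Rec. 709 and clipping whatever doesn't fit. This is my own toy example, not from any spec; the 3x3 matrix is the commonly quoted linear Rec. 2020 to Rec. 709 conversion, so treat the exact coefficients as approximate.

Code:
# Toy example: squeeze a linear Rec. 2020 colour into Rec. 709 and clip it.
# The matrix is the commonly quoted BT.2020 -> BT.709 primary conversion.
BT2020_TO_BT709 = [
    [ 1.6605, -0.5876, -0.0728],
    [-0.1246,  1.1329, -0.0083],
    [-0.0182, -0.1006,  1.1187],
]

def to_rec709(rgb2020):
    """Linear Rec. 2020 RGB -> linear Rec. 709 RGB, clipped to [0, 1]."""
    converted = [sum(c * x for c, x in zip(row, rgb2020)) for row in BT2020_TO_BT709]
    return [min(max(v, 0.0), 1.0) for v in converted]   # the clip is where the saturation dies

# A fully saturated Rec. 2020 red wants R = 1.66 and *negative* green/blue in
# Rec. 709 terms -- impossible, so it gets clipped down to plain old (1.0, 0.0, 0.0).
print(to_rec709([1.0, 0.0, 0.0]))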


WHAT'S THE DEAL WITH THIS TALK OF MORE BRIGHTNESS?

This is a hard one. I won't embed them on this page, but throughout GAF you see these horribly inaccurate SDR (normal) vs. HDR images where the HDR side is simply brighter and more saturated. While these images are fake, they're not entirely misleading.

This is the answer I have decided to go with:

[image: UHD Premium logo]


The above logo refers to a specification that may allow a TV to be advertised as UHD Premium. The requirements are as follows (there's a quick sanity check in code just after the list):
  • At least 3,840 x 2,160 (4K)
  • 10-bit Colour
  • At least 90% of DCI P3
  • More than 1000 nits (peak brightness) and less than 0.05 nits black level or,
  • More than 540 nits (peak brightness) and less than 0.0005 nits black level (since you don't need high brightness if your blacks are so good, like an OLED)
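
If it helps, the two contrast paths above boil down to a check like this. It's a throwaway sketch of mine, with the numbers taken straight from the list (and it only covers the contrast part, not the resolution/bit-depth/gamut requirements):

Code:
# Quick check of the two alternative UHD Premium contrast requirements above.
def meets_uhd_premium_contrast(peak_nits, black_nits):
    bright_path = peak_nits > 1000 and black_nits < 0.05    # LCD-style: very bright
    dark_path = peak_nits > 540 and black_nits < 0.0005     # OLED-style: very deep blacks
    return bright_path or dark_path

print(meets_uhd_premium_contrast(1100, 0.04))    # bright LCD with decent blacks -> True
print(meets_uhd_premium_contrast(600, 0.0001))   # dimmer OLED with near-perfect blacks -> True
print(meets_uhd_premium_contrast(600, 0.04))     # dim panel with mediocre blacks -> False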

What's happening is that while HDR10 is an open platform and contrast ratio requirements aren't part of the specification (there are none), TV companies are sort of pretending it DOES have minimum specifications -- and that all TVs prior to HDR10 are garbage with low brightness and horrible blacks. A lot of the comparisons you are looking at come from UHD Premium compliant TVs that are showing off their contrast ratio alongside their HDR compatibility.

Does that mean a non-UHD Premium TV can't do HDR? No. It's just really unlikely that you'll get a screen sporting HDR that will have terrible contrast ratio, although it is certainly possible.

Some additional perspective:
...Philips, too, is sticking to HDR-only branding, although none of its tellys meet the UHD Premium specification anyway; its top-tier ‘HDR Premium’ models only go up to 700nits, with its ‘HDR Plus’ sets coming in around 400nits. ~ WHAT HI*FI


A COMPARISON BETWEEN 8-BIT AND LOWER?

If this looks like a minor difference, sure. You could write an entire paper on how we process the images we see. Essentially, we're pretty good at putting colours next to each other to make them look like another colour:

[image: 256-colour version of an image, dithered with error diffusion]


I used Error Diffusion simply to prove a point. Even with just 256 total colours (down from 16.8 million), we have come a long way with compression techniques and can do a lot within a limited space, so you wouldn't see much of a difference. You don't need to use all these colours at the same time; it's more about which colours you can use. If you get a scene that needs lots of shades of the same colour, that's where you need more colour depth:

[image: 256-colour version of an image, nearest-colour quantisation]


Here I used Nearest (no dithering) to prove a point (I'm cheating). Anyway, you can see in this image that there is no way for the limited number of colours to show BFF-chan's face without some serious issues. We only have 256 colours available.
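
For anyone who wants to recreate these two versions, something like Pillow can do it -- quantize with Floyd-Steinberg error diffusion vs. plain nearest-colour mapping. This assumes a reasonably recent Pillow, and the filenames are just placeholders:

Code:
# Reduce an image to a 256-colour palette two ways: with error diffusion
# (dithered, like the first example) and with nearest-colour only (the second).
from PIL import Image

img = Image.open("bff_chan.png").convert("RGB")   # placeholder filename

dithered = img.quantize(colors=256, dither=Image.Dither.FLOYDSTEINBERG)
nearest = img.quantize(colors=256, dither=Image.Dither.NONE)

dithered.save("bff_chan_256_dithered.png")
nearest.save("bff_chan_256_nearest.png")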

These comparisons are completely cheating anyway. If you had a display that could only produce 256 colours, that wouldn't mean 256 at once, it would mean 256 overall. That means our 256-colour display can't produce both images. What you're more likely to get is this:

[image: both images sharing a single 256-colour palette]


So with HDR10, there are some scenes we can show that we currently don't even think of showing. Think back to the picture of the fire: can you see the individual bright reds in the middle of the flame? No, because that information isn't there, and our SDR screens couldn't display it anyway. So you have to understand, these images won't even show you anything new on your HDR display, because they were shot in 8-bit.

IS IT LIKE LIMITED VS. FULL COLOUR SPACE?

Yes... But the differences are FAR greater for what we call DEEP COLOR (quick arithmetic check after the list):
  • LIMITED : < 8/24-bit : (16 - 235) = 220*220*220 = 10,648,000 (10.6m) Colours
  • FULL : = 8/24-bit : (0 - 255) = 256*256*256 = 16,777,216 (16.8m) Colours
  • DEEP : 10/30-bit : (0 - 1023) = 1024*1024*1024 = 1,073,741,824 (1.07b) Colours
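
Code:
# Per-channel levels cubed = total displayable colours (verifying the list above).
for label, levels in [("LIMITED (16-235)", 220),
                      ("FULL (0-255)", 256),
                      ("DEEP (0-1023)", 1024)]:
    print(f"{label}: {levels}^3 = {levels ** 3:,} colours")
# LIMITED (16-235): 220^3 = 10,648,000 colours
# FULL (0-255): 256^3 = 16,777,216 colours
# DEEP (0-1023): 1024^3 = 1,073,741,824 colours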


8-BIT? 24-BIT? 10-BIT? 30-BIT?

ikr? 8 bits means 256 shades per colour channel. Since we use red, green and blue, we sometimes call this 24-bit (8*3) True Colour. 10-bit is 1024 shades per channel, etc.


IS HDR10 JUST 10-BIT COLOUR THEN?

No, the HDR Media Profile is other stuff too (in super simple terms -- there's a small PQ sketch after the list):
  • EOTF: SMPTE ST 2084, or Perceptual Quantizer (PQ), lets us use Rec. 2020 and much higher luminance.
  • METADATA: SMPTE ST 2086, MaxFALL, MaxCLL - this allows the device to tell the screen what to do with the image. This is why your 10-bit monitor can't display HDR content, etc.
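
For the curious, the EOTF part isn't magic. Here's a minimal sketch of the ST 2084 (PQ) encode side using its published constants, mapping absolute luminance in nits to a 0-1 signal value. It's just the curve itself, none of the metadata handling:

Code:
# SMPTE ST 2084 / PQ inverse EOTF: absolute luminance (nits) -> 0..1 signal.
# Constants as published in the standard.
m1 = 2610 / 16384            # 0.1593017578125
m2 = 2523 / 4096 * 128       # 78.84375
c1 = 3424 / 4096             # 0.8359375
c2 = 2413 / 4096 * 32        # 18.8515625
c3 = 2392 / 4096 * 32        # 18.6875

def pq_encode(nits):
    """Map 0..10,000 nits to a 0..1 PQ signal value."""
    y = max(nits, 0.0) / 10000.0
    yp = y ** m1
    return ((c1 + c2 * yp) / (1 + c3 * yp)) ** m2

# 100 nits (roughly SDR reference white) lands around the middle of the signal
# range, leaving the upper code values for highlights all the way up to 10,000 nits.
for nits in (0.05, 100, 1000, 10000):
    print(f"{nits} nits -> {pq_encode(nits):.4f}")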


I WANT THE UNSIMPLIFIED EXPLANATION
---

edit: thank you for the GAFgold!!!!!!!!!!
/s

edit:
HDR in this thread stands for High Dynamic Range. Not to be confused with High Dynamic Range (photography) or High Dynamic Range (post-processing).
 

Skilletor

Member
This is all so frustratingly confusing.

I have a Samsung UN55KU630DF

I didn't really know anything about HDR yesterday, so I researched yesterday and it seems like my TV doesn't have the "wide color gamut" necessary for HDR.

But on my box it says this:


So after reading this thread, which is great btw, thank you, I don't know what I have. :(
 

JP

Member
Dolby Vision? I'm not sure the full profile will ever become the standard but it's probably worth including it here.
 

Izuna

Banned
This is all so frustratingly confusing.

I have a Samsung UN55KU630DF

Your TV should be fine, looking at the spec sheet. Is there an option or firmware update you need to do?

Dolby Vision? I'm not sure the full profile will ever become the standard but it's probably worth including it here.

People had issues understanding limited vs. full, so I tried my best to simplify everything haha. But guys, Dolby Vision, unlike HDR10, is a specific specification that goes up to 12-bit and also uses the Rec. 2020 colour space. Something that can use Dolby Vision should be able to use HDR10, but some recent TVs either require an update from the manufacturer, or some moron simply hasn't flagged the TV as compatible. Dolby Vision is a much higher standard with regard to other things too, like the UHD Premium label.
 

jaaz

Member
Nice, thank you. You might want to add a discussion of Dolby Vision HDR and the current state of play between these two competing standards, since it's a bit similar to the old Blu-ray vs. HD DVD format war.

Edit: Beaten by like 2.5 seconds. :0
 

Soodanim

Gold Member
Based on how long I had my last TV for and me having owned my new one for 6 months, I'll look forward to when I get a compatible TV in about 8 years. Monitor in 3-5, probably.

But I do like the idea and I imagine everything looks gorgeous. Thanks OP.
 

Skilletor

Member
Your TV should be fine, looking at the spec sheet. Is there an option or firmware update you need to do?

There isn't.

I think the TV looks amazing, but, like I said, I wasn't sure what HDR was before this week, and all of the explanations online are just very confusing.

Thank you :)
 
With all the buzz words flying around these days, it's good to find an easy-to-digest informative post. Thank you!

I wonder how much further we can go with color fidelity after this. The Dolby Vision talks are getting me interested.
 

Izuna

Banned
How does this help me if all I watch is anime?

Honestly? It won't, not for a long time. If anything, anime won't dare use any colours that aren't available in Rec. 709, and even if they did, it wouldn't matter much. Tone mapping helps realism; for cartoons I don't think anyone would care about a shirt being a slightly different colour.

Unless you're talking about anime with high production value that uses 3D rendering in some scenes or other post processing effects. THEN you'll have improved IQ. Not that anyone should ever, ever care.
 
Dolby Vision? I'm not sure the full profile will ever become the standard but it's probably worth including it here.

Yeah! What about the HDR WARS???

DOLBY VISION vs HDR10? :p

Dolby Vision seems to edge it out in performance or something like that, I believe. But since TVs would have to purchase the license, it seems like it'll fail. :(

OP: Will you include the current apps that utilize HDR10 and Dolby Vision like Vudu, Amazon and Netflix? or am I asking for too much :p
 

kiunchbb

www.dictionary.com
This is all so frustratingly confusing.

I have a Samsung UN55KU630DF

I didn't really know anything about HDR yesterday, so I researched yesterday and it seems like my TV doesn't have the "wide color gamut" necessary for HDR.

But on my box it says this:



So after reading this thread, which is great btw, thank you, I don't know what I have. :(

Same situation for my Sony 850C. It's advertised as HDR10, but according to rtings, since the TV doesn't have local dimming it won't show the full range of HDR10 because there isn't enough brightness. I wish I had known about this before getting the TV.

Thanks for the thread.
 
Manufacturers pushing too much shit at once is the problem.

I'm all about tech but I'm just going to stay away for a few more years until the dust settles.

Great thread btw.
 

GeoNeo

I disagree.
Make sure to warn people that the HDR10 standard atm is static metadata & next year TVs will be moving to dynamic HDR10, which is huge. (It will get pushed out with the new HDMI 2.1 spec.)
 

Izuna

Banned
Manufacturers pushing too much shit at once is the problem.

I'm all about tech but I'm just going to stay away for a few more years until the dust settles.

Great thread btw.

I think we have to blame the fact that there are manufacturers that want to sell you 4k screens with terrible contrast ratios. People would buy one, then think all high-end 4k is shit or not great.

So something like the UHD Premium sticker came about, but it costs money to go through certification (Pioneer won't touch it with a 10-foot pole rn). HDR10 was an open format, a solution for this, but manufacturers are trying their best to achieve it as cheaply as possible. The good thing is, it's not cheap enough.

My favourite thing about HDR10 is that it should get rid of super trash screens. Sadly, it only requires 4:2:0 decoding afaik, which sucks, but I'm not in the industry so there's probably a financially sound reason for this.

Same situation for my Sony 850C. It's advertised as HDR10, but according to rtings, since the TV doesn't have local dimming it won't show the full range of HDR10 because there isn't enough brightness. I wish I had known about this before getting the TV.

Thanks for the thread.

well there it is dot gif
 
Make sure to warn people that the HDR10 standard atm is static metadata & next year TVs will be moving to dynamic HDR10, which is huge. (It will get pushed out with the new HDMI 2.1 spec.)

Whoa, this is huge news. So with the dynamic metadata, will it now be 12bit instead of 10bit? or something totally different?

ok I found an article:
http://www.flatpanelshd.com/news.php?subaction=showfull&id=1463138030

Seems like it may be able to be updated via firmware, essentially just becoming what Dolby Vision is already doing now :p
 
D

Deleted member 59090

Unconfirmed Member
If a monitor I plan on getting is supposed to cover 125% of the sRGB gamut, how does that compare to HDR? Is sRGB just Rec. 709 in that graph there?
 

Izuna

Banned
My real question is...Can the OG PS4 really do this? Or is it some more limited software HDR that they meant?

Sure it can. It's more to do with the encoder and port. I think Sony is doing some wizardry with the encoder and realised that HDR10 doesn't care about resolution, so they can save bitrate on that side of things.

That's all we need for more colour depth, telling our devices to send it through and have enough power to do so. 10-bit colour has been achievable for a long time.

Make sure to warn people that the HDR10 standard atm is static metadata & next year TVs will be moving to dynamic HDR10, which is huge. (It will get pushed out with the new HDMI 2.1 spec.)

You just did. It's not something anyone can really prepare for though.
 

GeoNeo

I disagree.
Whoa, this is huge news. So with the dynamic metadata, will it now be 12bit instead of 10bit? or something totally different?

It's pretty much HDR10 copying Dolby Vision, so you get scene-by-scene updates telling the TV what it should be doing regarding many aspects of the final image.

HDR10 is a fucking shit show right now because it has one set of data for a whole movie/game and does not take into account different scenes (day, night, sunset, etc.), so you are stuck with whatever static data they decide to throw out.

HDR10 fucked consumers over & knowing this industry and having spent a fuck ton of money in it (over $50,000), I fully expect early HDR10 adopters to get shit on at CES 2017.

The UHD Alliance is full of shit too. That is what happens when you let the manufacturers set the rules: they label shit however they want so they can sell more low-end TVs that have no place being certified, displays that can't truly express the vision & research Dolby put into high dynamic range.
 
I've always liked pointing to the Giant Bomb video set as a good example of one of the things that HDR takes care of. I think they blow their colours out like this on purpose, though - in this example, it does give a nice streak of colour, but let's assume that if they could avoid it they would.


On the background lights, you can see the colours often get clamped, either going from a gradient to a solid green, or the violet light turning magenta and then white, or the blue lighting turning teal and then white as well.

Take the blue light as an example: in reality it would be mostly blue, and contribute just a little bit of red and green. Let's say the colour values are (0.1, 0.2, 1.0) for (red, green, blue), and the intensity of that light varies from 0 to 10 as you go from the edge of the light to its center. Near the edge, where the intensity is 0.5, the intensity multiplied by the light colour gives a result of (0.05, 0.1, 0.5), so the light looks as blue as it would in real life, just kind of dim. Nearer the center of the light's frustum, though, the intensity becomes 4, so the combined light colour is (0.4, 0.8, 4). However, on low-dynamic-range media, that last value has to be clamped to 1, so the value that you actually see is just (0.4, 0.8, 1.0). This makes the light turn more teal, because the blue channel has become as bright as is permitted. The value as a whole is also just "less blue" - instead of the blue channel being 400% more intense than the green channel, it's now only 25% more intense. And that's with the light not even anywhere near its full intensity! Closer to the center, the green channel will be similarly clamped, and start appearing less intense relative to the red channel, until eventually everything hits (1.0, 1.0, 1.0).

For HDR media and displays, that clamping doesn't happen, so the combined light colour stays at (0.4, 0.8, 4), or (1.0, 2.0, 10.0) and the light stays as blue as it is meant to be all of the way through the light's area of coverage.
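
(If code is easier to follow than prose, here's the same clamping written out as a throwaway snippet -- same light colour and intensities as above, nothing else going on.)

Code:
# The blue light from the example: colour (0.1, 0.2, 1.0) at increasing intensity,
# shown clamped (what an SDR pipeline keeps) and unclamped (what HDR can keep).
LIGHT_COLOUR = (0.1, 0.2, 1.0)   # (red, green, blue)

def lit_colour(intensity, clamp=True):
    rgb = [channel * intensity for channel in LIGHT_COLOUR]
    return [min(channel, 1.0) for channel in rgb] if clamp else rgb

for intensity in (0.5, 4.0, 10.0):
    print(intensity, "SDR:", lit_colour(intensity), "HDR:", lit_colour(intensity, clamp=False))
# At intensity 4 the SDR colour is (0.4, 0.8, 1.0) -- drifting teal -- while the
# unclamped colour keeps the (0.4, 0.8, 4.0) ratio and stays blue.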

In the presentations and discussions I've seen of HDR, there's an endless amount of talk and bluster about how "everything looks more vibrant" or "everything is more colourful", but all of that fell away instantly when I saw an HDR demo that, at one point, had someone wearing a dress of purple sequins like this:


Instead of the dress just reflecting white, it reflected sparkles that were actually purple, and I had never seen a TV display colours that looked like that.



But, more simply, HDR media on an HDR display just ends up looking how you'd expect a new TV to look.
 
This is helpful, thanks OP. I have been wondering this for a while. I do some HDR photography where you take pictures at multiple exposures and stitch them together, but never really understood how the video version worked.
 

Izuna

Banned

You can see that photoshop background without banding in 8-bit if the GPU wasn't trying to send 10-bit colour.

This is helpful, thanks OP. I have been wondering this for a while. I do some HDR photography where you take pictures at multiple exposures and stitch them together, but never really understood how the video version worked.

Oh, I have a Lumia 950 XL. I love it specifically for how good its HDR feature is for a phone (it lets me use a slider): http://i.imgur.com/GX1UoNY.jpg

The iPhone camera app stitches the different exposures together automatically for you, and saves the non-HDR image as a separate file. They need to steal from Microsoft stat (also, living images are stored much better on Lumia, but that's less important).

I'm sure you're not talking about phone cameras though haha.

HDR (photography) != HDR Media Profile, for anyone else reading
 

wildfire

Banned
I applaud this OP. It's almost spot on at conveying the key points simply. I think the flame example is the one that feels stale, because it relies on us using our memory, and not too many people spend time looking at an open flame after childhood.

I knew there was a difference between OLED and LED but I didn't realize it was because of the black levels. Thanks for clearing that up.
 

GeoNeo

I disagree.
Is there any technical reason 1080 displays can't use HDR?

None.

If manufacturers wanted, they could put out 10-bit HDMI 2.0a sets that supported HDR with advanced backlighting (in the case of LCD); it's not tied to resolution.

The Dolby Pulsar high dynamic range mastering monitor is 1080p, puts out 4,000 nits & is water cooled.
 

Izuna

Banned
Is there any technical reason 1080 displays can't use HDR?

Nope. Although screens have been using 10, even 12-bit colour for years.

HDR requires the other stuff: metadata and SMPTE ST 2084, which we didn't bother with until after 4k became our target. That's why there aren't any. You'll see 1440p HDR monitors early next year I bet.

I applaud this OP. It's almost spot on at conveying the key points simply. I think the flame example is the one that feels stale, because it relies on us using our memory, and not too many people spend time looking at an open flame after childhood.

I knew there was a difference between OLED and LED but I didn't realize it was because of the black levels. Thanks for clearing that up.

I originally used a picture of a flame that had visible reds, and the explanation still worked, but I wanted the image to work for my later example too haha. People can just take my word for it, that looking at fire in real life shows them bright colours their screen cannot yet reproduce. =)

if you find a better image for me to use let me know, I went through pages of google images

edit: i need to proof the OP... omg
 

longdi

Banned
Does HDR not require full array backlighting?
Those are more expensive and bulky than current edge lighting.

Will PC monitors ever get 'real' HDR?
I fear the smaller size requirement will mean either more expensive tiny LED arrays or we'll have to live with blooming...
 
It's pretty much HDR10 copying Dolby Vision, so you get scene-by-scene updates telling the TV what it should be doing regarding many aspects of the final image.

HDR10 is a fucking shit show right now because it has one set of data for a whole movie/game and does not take into account different scenes (day, night, sunset, etc.), so you are stuck with whatever static data they decide to throw out.

HDR10 fucked consumers over & knowing this industry and having spent a fuck ton of money in it (over $50,000), I fully expect early HDR10 adopters to get shit on at CES 2017.

The UHD Alliance is full of shit too. That is what happens when you let the manufacturers set the rules: they label shit however they want so they can sell more low-end TVs that have no place being certified, displays that can't truly express the vision & research Dolby put into high dynamic range.

So I can say that I kind of have a future proof TV for the moment since they made it with Dolby Vision in mind (12bit-Dynamic HDR). (2016 Vizio P-Series). As in I don't see an issue if they add HDR10 Dynamic later since the SOC can already support it. They already added Static HDR10 last month.
 

JP

Member
Is there any technical reason 1080 displays can't use HDR?
I'm not aware of any 1080p TVs that do. I think it's always been seen as a 4K "thing" but maybe as it becomes more common it will start to happen.

The main issue with pretty much everything to do with 4K at the moment is that it's still very much in flux. It could well be four or five years before you can walk into a store and buy a decent TV without having to worry about everything that's going on and whether it's going to be the best TV for the equipment that you already have and the services that you're intending to use.
So I can say that I kind of have a future proof TV for the moment since they made it with Dolby Vision in mind (12bit-Dynamic HDR). (2016 Vizio P-Series). As in I don't see an issue if they add HDR10 Dynamic later since the SOC can already support it. They already added Static HDR10 last month.
Unfortunately, not really. The versions of Dolby Vision that you get in current sets are nowhere near the final specification of the Dolby Vision profile. Even the best screens aren't close to doing the things that the Dolby Vision profile will be doing in a few years; the technology just doesn't exist at the moment.
 

EvB

Member
So I can say that I kind of have a future proof TV for the moment since they made it with Dolby Vision in mind (12bit-Dynamic HDR). (2016 Vizio P-Series). As in I don't see an issue if they add HDR10 Dynamic later since the SOC can already support it. They already added Static HDR10 last month.

Not really.

The Dolby Vision component only works with the built-in streaming services on the P-Series.

Also, the P-Series only reaches 600 nits.

UHD Premium requires 1,000 as the minimum; Dolby is proposing that future content will go as high as 10,000.
 

GeoNeo

I disagree.
So I can say that I kind of have a future proof TV for the moment since they made it with Dolby Vision in mind (12bit-Dynamic HDR). (2016 Vizio P-Series). As in I don't see an issue if they add HDR10 Dynamic later since the SOC can already support it. They already added Static HDR10 last month.

Yes, you have a much more "future proof" TV. Netflix, Vudu, and Amazon support Dolby Vision, plus the SOC is much more flexible.

I'm not saying you'll get dynamic HDR support but if the Vizio & LG 2016 sets that do support Dolby Vision did get updated with Dynamic HDR I would not be shocked.

All manufacturers really need to make sure dynamic HDR gets added to all HDR TVs, but simply put, in this industry they mostly care about their next customer. The sad news is that the paper Philips wrote re: dynamic HDR only mentioned HDMI 2.1, which locks out all current TVs unless they were forward compatible from the get-go.

CES is gonna piss off fuck tons of HDR10-only owners, or finally let them breathe if companies confirm they will update 2016 models.
 

GeoNeo

I disagree.
Unfortunately, not really. The versions of Dolby Vision that you get in current sets are nowhere near the final specification of the Dolby Vision profile. Even the best screens aren't close to doing the things that the Dolby Vision profile will be doing in a few years; the technology just doesn't exist at the moment.

It will always map properly to his TV scene by scene; mastering is already being done on 4,000-nit displays with great results at home with Dolby Vision. As a standard it was well thought out & researched to ensure sets don't become useless when the final goal of 10,000 nits is hit.
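
(Dolby's actual mapping algorithm is proprietary, so this is just a toy roll-off I'm making up to show the general idea: content mastered on a 4,000-nit monitor getting squeezed onto a 600-nit panel without crushing the mid-range. Every number here is invented for the example.)

Code:
# Toy highlight roll-off: keep everything below a knee untouched, then compress
# the range between the knee and the content's mastering peak into what is left
# of the display's headroom.
def tone_map(nits, content_peak=4000.0, display_peak=600.0, knee=0.75):
    threshold = knee * display_peak           # below this, pass through unchanged
    if nits <= threshold:
        return nits
    excess = (nits - threshold) / (content_peak - threshold)
    return threshold + excess * (display_peak - threshold)

for nits in (100, 450, 1000, 4000):
    print(f"{nits} -> {tone_map(nits):.1f}")
# 100 -> 100.0, 450 -> 450.0, 1000 -> ~473, 4000 -> 600.0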

edit: of course Sony, Samsung, and the like don't want to pony up shit to Dolby, so they rushed out an incomplete "standard" that is greatly compromised.
 

JP

Member
It will always map properly to his light output scene by scene; mastering is already being done on 4,000-nit displays with great results at home with Dolby Vision. As a standard it was well thought out & researched to ensure sets don't become useless when the final goal of 10,000 nits is hit.
Luckily, I've not suggested otherwise.

EDIT:
People won't download a firmware update that turns their current screens into 10,000-nit screens any more than people could download a firmware update to change their SD TVs into HD TVs. It's simply about hardware that doesn't exist at the moment, and it could be two years, three years or more before it does exist, and then probably only in the very top level of screens rather than consumer screens.
 

EvB

Member
edit: of course Sony, Samsung, and the like don't want to pony up shit to Dolby, so they rushed out an incomplete "standard" that is greatly compromised.

It won't just be them, it will be film studios, distributors and, what we're more interested in, game developers.

You can see how the open option is more attractive right now, especially when consumer technology is not even close.
 

jaaz

Member
It will always map properly to his TV scene by scene; mastering is already being done on 4,000-nit displays with great results at home with Dolby Vision. As a standard it was well thought out & researched to ensure sets don't become useless when the final goal of 10,000 nits is hit.

edit: of course Sony, Samsung, and the like don't want to pony up shit to Dolby, so they rushed out an incomplete "standard" that is greatly compromised.

And who do you think ends up paying for those Dolby royalties? I much prefer an open standard--particularly one that isn't hardware based--even if it takes some time to develop.
 

PFD

Member
Interesting thread, thanks for this

I was planning on buying an ultrawide 1440P X34P (when it comes out). But now I'm not sure if I should wait for PC HDR screens to release. I was told that PC monitors won't be getting HDR for a long time due to their smaller size (more difficult to implement HDR), is that true?
 

Izuna

Banned
Interesting thread, thanks for this

If it helps eliminate those horrible comparisons, that's all I need it to do.

But you're very welcome =)

Thanks to the others for expanding too

I was planning on buying an ultrawide 1440P X34P (when it comes out). But now I'm not sure if I should wait for PC HDR screens to release. I was told that PC monitors won't be getting HDR for a long time due to their smaller size (more difficult to implement HDR), is that true?

Early 2017, we'll get them. It's not hard to produce an HDR Monitor, we just have to give them the proper decoder (which means security sadly). It has nothing to do with size.

Microsoft and nVidia have your back, but you need at least a Pascal device to use HDR on PC (since prior GPUs didn't think to include it).
 

GeoNeo

I disagree.
And who do you think ends up paying for those Dolby royalties? I much prefer an open standard--particularly one that isn't hardware based--even if it takes some time to develop.

People have no problem paying royalties for HDMI, Dolby Digital, DTS, Etc.

Also, new SOCs from the likes of MediaTek support it natively now because they meet certain performance standards.

Consoles would have no problem supporting it; it all comes down to them not wanting to pay a cent & wanting to be in control of the standard so they don't need to pass a certification process.

Though who knows maybe MS will add it for Scorpio.

If a deal was done, Sony could support Dolby Vision too, but it all comes down to them. I'd be happy to pay to "unlock" Dolby Vision support for my 4K console.
 