
So what IS the HDR standard?

LordofPwn

Member
Thank you OP, if only marketing people would use this instead.

Also, concerning Dolby Vision, the TL;DR is:
Dolby Vision (10-12 bit) is better than HDR10 and is the cinema/pro standard, but if a device is Dolby Vision labeled it also supports HDR10.
 
DV is going to die by the nature of the industry.

UHD HDR Blu-rays must always support HDR10, Samsung and Sony aren't supporting DV, and the consoles will run HDR10.

DV has a long road in front of it with those headwinds. I don't see it becoming much of a factor.

Much like how HD DVD had better features but lost, I don't see DV faring much differently.


There is no reason not to support both; most studios are already supporting DV, and supporting both formats has no real negative impact on either. Part of the agreement is that an HDR10 layer will exist on every disc that features DV. When it comes to standalone Blu-ray players, it won't matter whether your UHD Blu-ray collection is HDR10 or DV; you will be supported even if your display can't decode DV.

Truth is, SMPTE and Dolby don't even see them as competing formats. They're only viewed as open and proprietary.

Sony and Samsung just want to maintain control over how HDR is processed on their displays. You can't do DV in Vivid mode, which some people like.

Thank you OP, if only marketing people would use this instead.

Also, concerning Dolby Vision, the TL;DR is:
Dolby Vision (10-12 bit) is better than HDR10 and is the cinema/pro standard, but if a device is Dolby Vision labeled it also supports HDR10.

Yes, this so-called format war has been started by tech sites as clickbait.
 
Wow, Dolby is still around? I remember back when I used tapes for music; Dolby basically removed the hiss by just muffling the overall sound.

I also remember their ridiculously loud THX sound effect.
 

Wensih

Member
So while that link defines what the acronym HDR stands for and what that means, this thread still leaves the question of what HDR actually is unanswered.
 

LordofPwn

Member
Wow, Dolby is still around? I remember back when I used tapes for music; Dolby basically removed the hiss by just muffling the overall sound.

I also remember their ridiculously loud THX sound effect.
THX is its own thing, and Dolby has moved well beyond just audio. You should check out their website.
 

FroJay

Banned
Excited but not excited for these displays, since my camera shoots 16-bit with 15 stops of dynamic range. Then you have RED cameras, which have even higher dynamic range. I assume that's where displays are going in the next 2-4 years.
 
This sounds like the most asinine advance for gaming companies to get behind.

For real. This is really interesting tech and it's fascinating to read about (kudos to the OP for demystifying it), but the more I read on this, the sillier it sounds that both MS and Sony are trying to use it as a marketing gimmick in this day and age. Like someone else said, this is like the "True HD" and "contrast ratio" shit from last decade, except turned up to 11.

But both MS and Sony need to sell the mid-gen bump consoles and these are easy things to put on the box. And I'd wager it's working fairly well. We've seen plenty of posts here on GAF with people saying they're gonna go buy a new TV without even knowing half of the stuff found in this thread.
 

Izuna

Banned
Well typically the first thing to do when explaining an acronym is to mention what the acronym stands for. High Dynamic Range, HDR, is...

Especially when HD has been used most frequently for high definition.

Oh right.

I didn't feel like it was important. It's just marketing BS.
 

televator

Member
Wow, Dolby is still around? I remember back when I used tapes for music; Dolby basically removed the hiss by just muffling the overall sound.

I also remember their ridiculously loud THX sound effect.

Dolby has always been around... Where have you been? lol And yeah, THX is its own competing brand which, shockingly, also has standards for video. THX and Dolby are staples of AV entertainment.
 
This is a great writeup! Fantastic work. Things are going to get a little more structured with the next HDMI spec, but for now we're stuck in this transitional phase.
 

GeoNeo

I disagree.
I don't get why anyone thinks Dolby shouldn't get paid for bringing HDR to the masses. They paid for the research; all the display manufacturers would have done on their own is spit out higher resolutions. Dolby shared their research with SMPTE; if they hadn't, all you would have is Dolby Vision.

Yeah, it sucks that they did all the research and showed at past CESes why it was important while TV manufacturers were fucking around with pointless shit. Thanks to Dolby and its research we will finally be fully out of the SDR era, which was outdated as fuck.

Not only is Dolby Vision more future-proof with 12-bit and the potential for 10,000 nits waiting for the displays to catch up, the chip/software also adjusts to your specific display.

Meaning, it decodes the 'dynamic metadata' of the content and then maps it in real-time to the capabilities of the screen. Unlike static HDR10, it adjusts the PQ curve on a scene-by-scene (or frame-by-frame) basis.

Ding ding!

Also, they have authoring tools for movies & games which make the workflow much easier. Best of all, the spec was thought out to avoid all the issues "static" HDR10 has.
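For anyone wondering what that "PQ curve" actually is: it's the SMPTE ST 2084 transfer function, which maps a signal value to an absolute luminance between 0 and 10,000 nits. Here's a quick Python sketch of the EOTF; the constants come straight from ST 2084, but the example code value assumes full-range 10-bit, so treat the nit figure as ballpark:

# SMPTE ST 2084 (PQ) EOTF: normalized signal value E' in [0, 1] -> luminance in nits
m1 = 2610 / 16384          # 0.1593017578125
m2 = 2523 / 4096 * 128     # 78.84375
c1 = 3424 / 4096           # 0.8359375
c2 = 2413 / 4096 * 32      # 18.8515625
c3 = 2392 / 4096 * 32      # 18.6875

def pq_eotf(e):
    # Convert a normalized PQ code value (0..1) to absolute luminance in nits.
    ep = e ** (1.0 / m2)
    return 10000.0 * (max(ep - c1, 0.0) / (c2 - c3 * ep)) ** (1.0 / m1)

print(pq_eotf(769 / 1023))   # ~1000 nits (a full-range 10-bit code of 769)
print(pq_eotf(1023 / 1023))  # 10000 nits, the ceiling of the format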

There is no reason not to support both; most studios are already supporting DV, and supporting both formats has no real negative impact on either. Part of the agreement is that an HDR10 layer will exist on every disc that features DV. When it comes to standalone Blu-ray players, it won't matter whether your UHD Blu-ray collection is HDR10 or DV; you will be supported even if your display can't decode DV.

Truth is, SMPTE and Dolby don't even see them as competing formats. They're only viewed as open and proprietary.

Sony and Samsung just want to maintain control over how HDR is processed on their displays. You can't do DV in Vivid mode, which some people like.


Yes, this so-called format war has been started by tech sites as clickbait.

Good old tech blogs got their clicks in, even if they don't explain any of the pitfalls of HDR10, or how the HDR10-only sets people are buying today have a very high chance of being made obsolete by early next year. And of course MS & Sony won't say shit, but will be happy to push through support for dynamic HDR10 next year, and when owners from this year ask "hey, does my set support dynamic HDR?" they will simply be told to check their manufacturer's website or buy a display that supports the super awesome dynamic HDR10.
 

GeoNeo

I disagree.
The problem with the Dolby solution is vendors don't want to pay for the licensing when there is an open standard available.

They also don't want to meet any real hardline minimum spec, so they can sell off sub-par "HDR" products and rake in the $ from unsuspecting buyers.
 
I am excited to see these changes finally come about, considering the old spec was designed with CRTs in mind and we've come a long way since. Now if only we could kill 1080i; looking at you, broadcast!
 

Izuna

Banned
It may be marketing BS, but explaining the marketing BS should be the first thing done to help the layman decipher what HDR is and what it does.

If you saw the arguments happening in the other HDR thread atm, with its semantic disagreements, you wouldn't say so.

Anyway, I'll update the OP to say High Dynamic Range. Why not.

;)
 
Dolby has always been around... Where have you been? lol And yeah, THX is its own competing brand which, shockingly, also has standards for video. THX and Dolby are staples of AV entertainment.

Oh, interesting. I thought that was a Dolby brand.

I'm 100% focused on videogame technology (like PCs), and I don't invest in any kind of fancy surround sound systems (the wife finds them too loud, rolleyes) or the latest TV sets. I listen to music on YouTube, even on the go; I have unlimited, un-throttled 4G. I love hearing my favorite music sung live, and often there are some great covers that I even prefer, so there's no point in me ever buying music (digital or physical).

I've been shocked hearing about all this HDR stuff and how important it is to get the right set for PS4 Pro to take advantage of it. Definitely will do a lot of research before upgrading to 4K HDR.
 
The problem with the Dolby solution is vendors don't want to pay for the licensing when there is an open standard available.

Imagine you have a brand, and the success of your brand depends on your product standing out. So, in every manufacturer's view, the way to do this is: Dynamic Contrast Ratio (black crush), "LED TV" (still LCD), then you accompany that with oversaturated colors and a grayscale with a blue tint, because pushing blue up makes the picture appear brighter, even though it makes colors incredibly inaccurate. This has been working for you; your product is either the number 1 or number 2 display product on the market.

Dolby then shows up with HDR at CES and it wows onlookers. Display manufacturers immediately see this as the new display tech; 4K wasn't moving product like the transition from SD to HD did.

Dolby informs Sony, Samsung, Panasonic, Vizio and LG that DV will require a decoder chip and a licensing fee. Dolby then reveals the Golden Reference: this chip will determine how to properly display DV using Golden Reference files, based on the overall capabilities of each display (native nits, contrast, contrast ratio and percentage of Rec. 2020 coverage).

This is a problem, because if you tamper with backlight, color and brightness settings, Dolby Vision will not display properly. Dolby does this because of the differences in nits between the reference displays used to color grade HDR: Fox color grades at 1,000 nits using Samsung displays, while the Dolby Vision reference display does 4,000 nits.

So now your own proprietary chipset is useless when it comes to HDR; you can't oversaturate colors or add any additional post-processing to make your display stand out.

The problem with HDR10 is that if your display is below 1,000 nits, you can expect to get artifacts. Anyone with an LG OLED watching UHD Blu-rays can attest to this, and even Vizio displays only do 600 nits. The solution is tone mapping: an algorithm that makes sure detail isn't lost when a display can't reach the nits the color grade's metadata requests.

The difference between the two methods is that Dolby Vision has done the research on tone mapping and has the proper algorithm, carried in the dynamic metadata. This ensures that no matter the title or studio, you get an artifact-free experience. HDR10 tone mapping is a per-display algorithm, so it's all over the place.

In the end, with DV the only per-manufacturer advantage left is brightness.
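To make the tone-mapping point concrete, here's a toy sketch in Python. It is not Dolby's actual algorithm and not any particular TV's, just an illustration of the choice a panel faces when the grade targets more nits than it can produce, and of why per-scene metadata helps:

def hard_clip(nits, display_peak):
    # No tone mapping at all: everything brighter than the panel clips to its peak,
    # which is where the blown-out highlights come from.
    return min(nits, display_peak)

def knee_tone_map(nits, content_peak, display_peak, knee_frac=0.75):
    # Toy piecewise-linear tone map: leave everything below the knee alone, then
    # compress [knee, content_peak] into [knee, display_peak] so nothing clips.
    if content_peak <= display_peak:
        return nits  # the content already fits; don't touch it
    knee = knee_frac * display_peak
    if nits <= knee:
        return nits
    return knee + (nits - knee) * (display_peak - knee) / (content_peak - knee)

display_peak = 600.0  # e.g. a 600-nit set like the Vizio mentioned above

# Static metadata: one MaxCLL for the whole film (say 4,000 nits), so even a
# scene that never goes above 500 nits gets squeezed down.
print(knee_tone_map(500, content_peak=4000, display_peak=display_peak))  # ~452
# Per-scene (dynamic) metadata: the scene's own peak is known, so it is left alone.
print(knee_tone_map(500, content_peak=500, display_peak=display_peak))   # 500.0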
 

xevis

Banned
WHAT IS HDR?


  • 8-bit Colour
  • 10-bit Colour

I found this part of your post super confusing, since 8 bits allow for 256 values but "8-bit" TVs apparently display 2^24 colours. I had to Google to find out that the number of "bits" refers to the number of shades of each of the three primary colours (RGB) that a TV can display. So an "8-bit" TV can display 256 x 256 x 256 = ~16M colours. 10-bit TVs, by comparison, allow 1024 shades for each primary colour, so 1024 x 1024 x 1024 = ~1B colours.
 

Izuna

Banned
I found this super confusing, since 8 bits allow for 256 values but "8-bit" TVs apparently display 2^24 colours. It would be much more helpful to spell out that this refers to the number of shades of each of the three primary colours (RGB). So 256 x 256 x 256 = ~16M. 10-bit TVs, by comparison, allow 1024 shades for each primary colour, so 1024 x 1024 x 1024 = ~1B colours.

I do exactly this in my post, under the "8-BIT? 24-BIT? 10-BIT? 30-BIT?" heading.

I figured that if anyone recognises that 8-bit refers to 2^8, they'd probably just accept the logic. For the layman, I just wanted to show some way to calculate why we know there are 16.7m colours etc.

Since HDR likes to refer to 10-bit, I didn't feel like referring to it as 30-bit, especially since HDR is sometimes used to describe 12-bit à la Dolby.

I mean, some readers may not care for indices anyway; I went for what took the least space.
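If anyone wants the arithmetic spelled out anyway, it's just powers of two per channel:

# shades per channel is 2^bits; total colours is that cubed (one value each for R, G and B)
for bits in (8, 10, 12):
    shades = 2 ** bits
    print(f"{bits}-bit: {shades} shades per channel, {shades ** 3:,} colours ({bits * 3}-bit total)")

# 8-bit:  256 shades per channel,  16,777,216 colours (24-bit total)
# 10-bit: 1024 shades per channel, 1,073,741,824 colours (30-bit total)
# 12-bit: 4096 shades per channel, 68,719,476,736 colours (36-bit total)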

EDIT: instead of google, you could have kept reading. =P
 

Izuna

Banned
This must have been part of a subsequent edit? I had your post cached in my browser for hours before replying.

Hmm, I'm not sure. I wrote this around 4 PM my time, which was almost 8 or so hours ago. I think I posted with it in; it's in my draft .txt too.

My earliest draft had the heading with just "ikr" underneath. If I did leave it like that when I posted yesterday, it still should have shown up with the heading at least.

I'm happy you took the time to read it and not skim; I'm sorry that part isn't clear =(
 

Izuna

Banned
Fair enough. I seem to have missed it. Maybe combine the two sections? Nice post otherwise!

I was thinking about removing the math entirely to make it super easy to read haha, but showing at least some proof makes it look more technical, I guess. I'll see if I can move it around though.
 

spannicus

Member
Props for this. All of the recent HDR talk is making me want a new TV. I know it's all marketing, but there's some good info here. Thanks.
 
Imagine you have a brand, and the success of your brand depends on your product standing out. [...] In the end, with DV the only per-manufacturer advantage left is brightness.

Great summary of the situation! Thanks for this.
 
Do you guys think the LG UH7700 would look nice with HDR? They're on sale where I live. The KU6300 doesn't seem to support HDR, and the other ones are a little expensive for me.
 
Imagine you have a brand, and the success of your brand depends on your product standing out. [...] In the end, with DV the only per-manufacturer advantage left is brightness.

This post is GOD-like.
 

III-V

Member
Imagine you have a brand, and the success of your brand depends on your product standing out. [...] In the end, with DV the only per-manufacturer advantage left is brightness.

beautiful
 

Renekton

Member
HDR10 is solidifying itself as the standard and baseline, with maybe a 'final' HDR12 to come in a couple of years.
HDR10 is being updated with continuous metadata (Dolby Vision's greatest strength) in the HDMI 2.1 standard.

Please have FreeSync come too.

Will get the Samsung this year for my PS4 Pro with next TV update in 2019-20.
HDR TV with FreeSync would be amazing.
 

Izuna

Banned
Imagine you have a brand, and the success of your brand depends on your product standing out. [...] In the end, with DV the only per-manufacturer advantage left is brightness.

This is exciting.
 
Imagine you have a brand, and the success of your brand depends on your product standing out. [...] In the end, with DV the only per-manufacturer advantage left is brightness.

Very interesting. So if you primarily want a TV to use with the PS4 Pro, then DV won't help the PS4's output, since Sony says it supports HDR10 only (so if a TV is more expensive with DV versus an HDR10-only one, that extra money is wasted in terms of the PS4 Pro's output, since the additional "value" of DV isn't supported). That's unfortunate for Dolby, then...
 

Izuna

Banned
Very interesting. So if you primarily want a TV to use with the PS4 Pro, then DV won't help the PS4's output, since Sony says it supports HDR10 only (so if a TV is more expensive with DV versus an HDR10-only one, that extra money is wasted in terms of the PS4 Pro's output). That's unfortunate for Dolby, then...

If the TV supports the Dolby standard, it supports HDR10. The only screens that don't are ones that didn't get an obvious firmware update.
 

BigEmil

Junior Member
They also don't want to meet any real hardline minimum spec, so they can sell off sub-par "HDR" products and rake in the $ from unsuspecting buyers.
HDR10 is not "sub-par" compared to DV; there's not much of a difference. I've seen it first-hand, and HDR10 is fine.
 

Colbert

Banned
@Cobalt Izuna:

You may also add the technical requirements of an output device like a console to support the known HDR specifications. Would be a great addition.
 
I think it's just HDMI 2.0a, no? Which is what we need for 4K 60 fps.

The PS4 has HDMI 1.4 -- whatever they plan on doing to firmware-update their PS4s for HDR remains to be seen, but I don't see why not, considering 1080p won't require the extra bandwidth that 2.0+ is for.

Or who knows, maybe they could be bullshitting. That would be hilarious.
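Rough back-of-the-envelope on the bandwidth point, using the standard timing totals with blanking (2200x1125 for 1080p, 4400x2250 for 2160p). This ignores audio and encoding overhead, so treat the numbers as ballpark:

def video_data_rate_gbps(h_total, v_total, fps, bits_per_channel, channels=3):
    # Raw video payload across the link, ignoring TMDS encoding overhead,
    # audio and control periods.
    return h_total * v_total * fps * bits_per_channel * channels / 1e9

# 1080p60 at 10-bit 4:4:4 -- comfortably inside HDMI 1.4's ~8.16 Gbps of video bandwidth
print(video_data_rate_gbps(2200, 1125, 60, 10))  # ~4.5 Gbps

# 2160p60 at 8-bit 4:4:4 -- needs HDMI 2.0's 18 Gbps link (~14.4 Gbps of video bandwidth)
print(video_data_rate_gbps(4400, 2250, 60, 8))   # ~14.3 Gbps

Which is also why 4K60 HDR (10-bit) over HDMI 2.0 has to drop to 4:2:2 or 4:2:0 chroma to fit.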

Dynamic HDR seems to require a new HDMI spec, the presumed 2.1. I looked into some sources, but they are a few months old; this one is from May.

According to the Philips HDR whitepaper, the current HDMI 2.0a connector standard is not sufficient for its HDR proposal, and work is proceeding to develop a new HDMI connector that could accommodate it and other things.

Back in January at CES 2016, the HDMI Forum tipped its hand to the on-going development of the next HDMI standard (HDMI 2.1?) when Robert Blanchard, Forum president said: “The next version of the HDMI specification targets 8K resolution, enhanced High Dynamic Range and other important features such as delivering power over the interface to products such as streaming media sticks.”

http://hdguru.com/will-hdr-lead-to-another-new-hdmi-connector/

It seems static HDR is backwards-compatible down to HDMI 1.4, but for dynamic HDR I'm not so sure.
 
If the TV supports the Dolby standard, it supports HDR10. The only screens that don't are ones that didn't get an obvious firmware update.

Right, but any benefits of Dolby above generic HDR10 are lost if you're using the PS4 Pro as the source, since it will just use generic HDR10 on the Dolby set.

Or, put the other way round: if I only care about the PS4 Pro on my set, then all things being equal, if a Dolby set is more expensive than an HDR10 TV I might as well get the cheaper HDR10 one, since I won't see a difference.
 
Very interesting. So if you primarily want a TV to use with the PS4 Pro, then DV won't help the PS4's output, since Sony says it supports HDR10 only (so if a TV is more expensive with DV versus an HDR10-only one, that extra money is wasted in terms of the PS4 Pro's output, since the additional "value" of DV isn't supported). That's unfortunate for Dolby, then...

Actually... no! Because of the decoder chip, or SoC (system on a chip), the manufacturer will not need another chip like Sony's X1 to process DV metadata. The SoC that decodes DV also decodes HDR10, so the cost to Sony or Samsung is minimal.

http://www.displaydaily.com/display-daily/42935-dolby-vision-vs-hdr10-clarified

Sony and Samsung don't want the chip; they prefer their own, which they have control over. Think about how Vizio is able to produce a display that handles both methods, with FALD blue LEDs with red/green phosphor coating and 128 zones, and ship it with a tablet as a remote. Rtings ranks it second behind the LG OLED E6/G6 as the best HDR TVs on the market (the Sony ZD9 would probably knock it down a notch, and even more so if Sony supported DV).
 

Izuna

Banned
@Cobalt Izuna:

You may also add the technical requirements of an output device like a console to support the known HDR specifications. Would be a great addition.

Until Sony fesses up, I have no clue how to write an explanation that answers how the hell Sony is patching old PS4s for HDR.
 
I'd probably have to go write some code to illustrate a crunched chroma range, though. Most image editors don't let you do much with chroma compression besides just dropping the saturation across the whole image, and that's not what we want.
I'm using Lab color in Photoshop and using curves to maintain as much of the original tonality as possible, with a smooth roll-off, while reducing the color volume.
Just like an HDR TV would do with content that falls outside its range.
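Actually, here's roughly the same idea in code rather than Photoshop: a minimal sketch using scikit-image's Lab conversion with a simple tanh roll-off on chroma. It's not the exact curve I used for the shots below (those were done by hand in Photoshop), and the filenames are just placeholders:

import numpy as np
from skimage import color, io

def compress_chroma(rgb, max_chroma=40.0):
    # Convert sRGB -> CIELAB so lightness (L) and colourfulness (a, b) are separate
    lab = color.rgb2lab(rgb)
    a, b = lab[..., 1], lab[..., 2]

    # chroma is the distance from the neutral axis in the a/b plane
    chroma = np.sqrt(a ** 2 + b ** 2)

    # smooth roll-off: low-chroma colours pass through nearly untouched,
    # highly saturated ones get squeezed towards max_chroma instead of clipping
    new_chroma = max_chroma * np.tanh(chroma / max_chroma)
    scale = np.where(chroma > 1e-6, new_chroma / np.maximum(chroma, 1e-6), 1.0)

    lab[..., 1] = a * scale
    lab[..., 2] = b * scale
    return np.clip(color.lab2rgb(lab), 0.0, 1.0)

img = io.imread("ground_truth.png")[..., :3] / 255.0  # any sRGB image
io.imsave("chroma_compressed.png", (compress_chroma(img) * 255).astype(np.uint8))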

Let's do a reverse HDR.
Ground truth:
[image: groundtruth8qr0r.jpg]

Reduced lightness range; mostly affects the clouds.
[image: lightnesscompressionh8s8l.jpg]

Reduced chromacity range; notice it only affects the most saturated leaves.
[image: cromacitycompression2ts3e.jpg]

Reduced both:
[image: bothcompressederso7.jpg]

Reduced both + bit depth reduced by 2 bits (random dither). A few colors gone and more noise, but nothing drastic.
[image: reducedbitdepthmks44.jpg]


http://screenshotcomparison.com/comparison/184119
 