DV is going to die by the nature of the industry.
UHD HDR Blu-rays must always support HDR10, Samsung and Sony aren't supporting DV, and consoles will run HDR10.
DV has a long road in front of it with those headwinds. I don't see it becoming much of a factor.
Much like HD DVD, which had better features but lost, I don't see DV faring much differently.
Thank you OP, if only marketing people would use this instead.
Also, concerning Dolby Vision, the TL;DR is:
Dolby Vision (10-12 bit) is better than HDR10 and is the cinema/pro standard, but if a device is Dolby Vision labeled, it also supports HDR10.
THX is its own thing, and Dolby has moved beyond just audio stuff. You should check out their website.
Wow, Dolby is still around? I remember back when we used tapes for music; Dolby basically removed the hiss by just muffling the overall sound.
I also remember their ridiculously loud THX sound effect
So while that link defines what the acronym HDR stands for and what that means, this thread still leaves the question of what HDR actually is unanswered.
how so?
This sounds like the most asinine advance for gaming companies to get behind.
Well, typically the first thing to do when explaining an acronym is to mention what it stands for: High Dynamic Range, HDR, is...
Especially when HD has been used most frequently for high definition.
I don't get why anyone thinks Dolby shouldn't get paid for bringing HDR to the masses. They paid for the research; all the display manufacturers would have done on their own is spit out higher resolutions. Dolby shared their research with SMPTE; if they hadn't, all you would have is Dolby Vision.
Not only is Dolby Vision more future-proof, with 12-bit color and the potential for 10,000 nits waiting for displays to catch up, but the chip/software also adjusts to your specific display.
Meaning, it decodes the content's 'dynamic metadata' and then maps it in real time to the capabilities of the screen. Unlike static HDR10, it adjusts the PQ curve on a scene-by-scene (or frame-by-frame) basis.
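For context on what that metadata is adjusting: both HDR10 and DV store brightness with the SMPTE ST 2084 "PQ" transfer function, which maps a 10/12-bit code value to an absolute luminance in nits. A minimal sketch of the decode step (the constants are from the published standard; the code itself is just my illustration):

```python
# Sketch of the SMPTE ST 2084 (PQ) EOTF shared by HDR10 and Dolby Vision.
# It maps a normalized signal in [0, 1] to absolute luminance in nits.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_eotf(signal: float) -> float:
    """Decode a [0, 1] PQ signal value to luminance in cd/m^2 (nits)."""
    p = signal ** (1 / M2)
    return 10000.0 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

# 10-bit code 769 lands at roughly 1000 nits, and half-signal is only
# ~92 nits: PQ spends most of its code values where the eye is sensitive.
print(pq_eotf(769 / 1023))  # ~999
print(pq_eotf(0.5))         # ~92
```

What the DV chip is doing, conceptually, is deciding how to squeeze values near the top of that curve into whatever your particular panel can actually output.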
There is no reason not to support both, and most studios are already supporting DV. Supporting both formats really has no negative impact on either: part of the agreement is that an HDR10 layer will exist on every disc that features DV. When it comes to stand-alone Blu-ray players, it won't matter whether your UHD Blu-ray collection is HDR10 or DV; you will be supported even if your display can't decode DV.
Truth is, SMPTE and Dolby don't even see them as competing formats; they're only viewed as open versus proprietary.
Sony and Samsung just want to maintain control over how HDR is processed on their displays. You can't do DV in Vivid mode, which some people like.
Yes, this so-called format war has been started by tech sites as clickbait.
The problem with the Dolby solution is vendors don't want to pay for the licensing when there is an open standard available.
Oh right.
I didn't feel like it was important. It's just marketing BS.
It may be marketing BS, but explaining the marketing BS should be the first thing done to help the layman decipher what HDR is and what it does.
Dolby has been around all along... where have you been? lol And yeah, THX is its own competing brand that, shockingly, also has standards for video. THX and Dolby are staples of AV media entertainment.
WHAT IS HDR?
- 8-bit Colour
- 10-bit Colour
I found this super confusing, since 8 bits allow for 256 values, but "8-bit" TVs apparently display 2^24 colours. It would be much more helpful to spell out that this refers to the number of shades of each of the three primary colours (RGB). So 256 x 256 x 256 = ~16M. 10-bit TVs, by comparison, allow 1024 shades for each primary colour, so 1024 x 1024 x 1024 = ~1B colours.
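To make the multiplication concrete, a trivial sketch (nothing display-specific, just the arithmetic):

```python
# Shades per primary channel at a given bit depth, and the total
# number of displayable colors across the three RGB channels.
for bits in (8, 10, 12):
    shades = 2 ** bits        # values per channel
    total = shades ** 3       # R x G x B combinations
    print(f"{bits}-bit: {shades:>4} shades/channel = {total:,} colors")

# 8-bit : 256 shades  -> 16,777,216 colors (~16M, marketed as "24-bit")
# 10-bit: 1024 shades -> 1,073,741,824 colors (~1B, marketed as "30-bit")
# 12-bit: 4096 shades -> 68,719,476,736 colors (Dolby Vision's ceiling)
```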
I do exactly this in my post, under the 8-BIT? 24-BIT? 10-BIT? 30-BIT? heading.
This must have been part of a subsequent edit? I had your post cached in my browser for hours before replying.
Hmm, I'm not sure. I wrote this around 4 PM my time, which was almost 8 or so hours ago. I think I posted with it in; it's in my draft .txt as well.
Fair enough. I seem to have missed it. Maybe combine the two sections? Nice post otherwise!
Imagine you have a brand, and the success of your brand depends on your product standing out. In every manufacturer's view, here is how you do it: you add a Dynamic Contrast Ratio mode (black crush) and market an "LED TV" (still an LCD). Then you accompany that with oversaturated colors and a grayscale with a blue tint, because pushing blue up makes the picture appear brighter, even though it makes colors incredibly inaccurate. This has been working for you; your product is the number 1 or 2 display on the market.
Dolby then shows up with HDR at CES, and it wows onlookers. Display manufacturers immediately see this as the new display tech; 4K wasn't moving product like the transition from SD to HD did.
Dolby informs Sony, Samsung, Panasonic, Vizio, and LG that DV will require a decoder chip and a licensing fee. Dolby then reveals the Golden Reference: the chip determines how to properly display DV using Golden Reference files, based on the overall capabilities of each display (native nits, contrast, contrast ratio, and percentage of Rec 2020 coverage).
This is a problem, because if you tamper with backlight, color, and brightness settings, Dolby Vision will not display properly. Dolby does this because of the differences in nits between the reference displays used to color grade HDR: Fox color grades at 1000 nits using Samsung displays, while the Dolby Vision reference display does 4000 nits.
So now your own proprietary chipset is useless when it comes to HDR; you can't oversaturate colors or add any additional post-processing to make your display stand out.
The problem with HDR10 is that if your display is below 1000 nits, you can expect artifacts. Anyone with an LG OLED watching UHD Blu-rays can attest to this, and even Vizio displays only do 600 nits. The solution is tone mapping, an algorithm that makes sure detail isn't lost when the metadata requests a brightness the display can't reach.
The difference between the two approaches is that Dolby Vision has done the research on tone mapping and has the proper algorithm, delivered via the dynamic metadata. This ensures that no matter the title or studio, you get an artifact-free experience. HDR10 tone mapping is a per-display algorithm, so it's all over the place.
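To illustrate the idea, here's a toy roll-off in Python. This is purely my own sketch of the concept (Dolby's actual algorithm is proprietary, and real HDR10 tone mappers vary by vendor): below a knee point the image passes through 1:1, and highlights between the knee and the content's graded peak get compressed smoothly into the panel's headroom instead of hard-clipping.

```python
def tone_map(nits: float, content_peak: float = 4000.0,
             display_peak: float = 600.0, knee: float = 0.75) -> float:
    """Toy highlight roll-off; illustrative only, not Dolby's algorithm.

    Luminance below knee*display_peak passes through untouched; above it,
    values up to content_peak are eased into the remaining headroom, so
    bright detail compresses instead of clipping (clipping is what causes
    the artifacts people see on sub-1000-nit panels).
    """
    knee_nits = knee * display_peak
    if nits <= knee_nits:
        return nits
    t = (nits - knee_nits) / (content_peak - knee_nits)  # 0..1 above knee
    return knee_nits + (display_peak - knee_nits) * (t * (2.0 - t))

for n in (100, 450, 1000, 4000):
    print(f"{n:>5} nits graded -> {tone_map(n):6.1f} nits displayed")
```

Dolby Vision effectively ships the parameters for a curve like this per scene in its dynamic metadata; baseline HDR10 leaves the whole decision to whatever the TV maker implemented.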
In the end, to each manufacturer, DV really offers no advantage other than brightness.
So as a layman, stay the fuck out of the way for a couple years til things settle down? Gotcha.
Do you guys think the LG UH7700 would look nice with HDR? It's on sale where I live. The KU6300 doesn't seem to support HDR, and the other ones are a little expensive for me.
HDR TV with FreeSync would be amazing.
HDR10 is solidifying itself as the standard and baseline, with maybe a 'final' HDR12 to come in a couple of years.
HDR10 is being updated with dynamic metadata (Dolby Vision's greatest strength) as part of the HDMI 2.1 standard.
Please have FreeSync come too.
I'll get the Samsung this year for my PS4 Pro, with the next TV upgrade in 2019-20.
Very interesting. So if you primarily want a TV to use with the PS4 Pro, then DV won't help, since Sony says the PS4 supports HDR10 only (so if a TV is more expensive with DV versus HDR10 only, that extra money is wasted in terms of the PS4 Pro's output). That's unfortunate for Dolby then...
HDR10 is not "subpar" compared to DV; there's not much of a difference. I've seen it first hand, and HDR10 is fine.
They also don't want to meet any real hardline minimum spec sheet, so they can sell off sub-par "HDR" products to rake in the $ from unsuspecting buyers.
I think it's just HDMI 2.0a, no? Which is what we need for 4K 60fps.
The PS4 has HDMI 1.4 -- how they plan to firmware-update their PS4s to HDR remains to be seen, but I don't see why not, considering 1080p won't require the extra bandwidth that 2.0+ provides.
Or who knows, maybe they could be bullshitting. That would be hilarious.
According to the Philips HDR whitepaper, the current HDMI 2.0a connector standard is not sufficient for its HDR proposal, and work is proceeding to develop a new HDMI connector that could accommodate it and other things.
Back in January at CES 2016, the HDMI Forum tipped its hand about the ongoing development of the next HDMI standard (HDMI 2.1?) when Forum president Robert Blanchard said: "The next version of the HDMI specification targets 8K resolution, enhanced High Dynamic Range and other important features such as delivering power over the interface to products such as streaming media sticks."
If the TV supports the Dolby standard, it supports HDR10. The only screens that don't are ones that didn't get an obvious firmware update.
@Cobalt Izuna:
You might also add the technical requirements an output device like a console needs in order to support the known HDR specifications. That would be a great addition.
I'm using Lab colors in Photoshop and using curves to maintain as much of the original tonality as possible, with a smooth roll-off, while reducing the color volume. I'd probably have to go write some code to illustrate a crunched chroma range, though. Most image editors don't let you do much with chroma compression besides just dropping the saturation across the image, and that's not what we want.
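For anyone curious, here's roughly what that code could look like. A rough sketch of a crunched chroma range in Lab (assuming numpy and scikit-image are available; the roll-off curve is my own choice, not anything standard): chroma below a knee passes through, everything above rolls off smoothly toward a ceiling, and L* is left alone, which is exactly what plain desaturation can't do.

```python
import numpy as np
from skimage.color import rgb2lab, lab2rgb  # assumption: scikit-image installed

def compress_chroma(rgb: np.ndarray, max_chroma: float = 60.0,
                    knee: float = 0.7) -> np.ndarray:
    """Crunch the chroma range with a smooth roll-off, leaving L* untouched.

    rgb: float image in [0, 1], shape (H, W, 3).
    Chroma below knee*max_chroma passes through 1:1; anything above rolls
    off asymptotically toward max_chroma instead of clipping, giving the
    smooth shoulder you can't get by just lowering saturation globally.
    """
    lab = rgb2lab(rgb)
    a, b = lab[..., 1], lab[..., 2]
    chroma = np.hypot(a, b)                  # distance from the neutral axis
    knee_c = knee * max_chroma
    headroom = max_chroma - knee_c
    excess = np.maximum(chroma - knee_c, 0.0)
    rolled = knee_c + headroom * (excess / (excess + headroom))
    new_chroma = np.where(chroma > knee_c, rolled, chroma)
    scale = np.where(chroma > 1e-6, new_chroma / np.maximum(chroma, 1e-6), 1.0)
    lab[..., 1] *= scale                     # shrink a/b radially: hue and
    lab[..., 2] *= scale                     # lightness stay where they were
    return np.clip(lab2rgb(lab), 0.0, 1.0)

# e.g. compress_chroma(img) on any float RGB array pulls saturated reds
# and blues toward the gamut core while neutrals pass through untouched.
```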