Why are you adding 2 bits per channel to the bit depth per color?
HDR10 is 10 bit, and Dolby Vision is 12 bit.
Why are you adding 2 bits per channel to the bit depth per color?
Well, the last one they released is still $400.
http://www.samsung.com/us/televisio...3500u-one-connect-evolution-kit-sek-3500u-za/
They did skip releasing one in 2016 in favor of updating the OS, but maybe all these spec changes would convince them to do another.
One of the many reasons you shouldn't have bought a half baked 4K TV last year.
So what you're saying is don't buy a 4K TV this year after all?
HDR10 is 10 bit, and Dolby Vision is 12 bit.
Dolby Vision is a little better?
Dolby Vision is a little better?
If 64 times the color palette is "a little bit" then yes.
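For anyone wondering where the 64x comes from, it's just counting representable RGB values per bit depth (nothing to do with metadata or gamut). A quick sanity check:

```python
# 64x figure: each extra bit per channel doubles the values per channel,
# and there are three channels (R, G, B).
ten_bit = (2 ** 10) ** 3      # ~1.07 billion colors at 10 bits/channel
twelve_bit = (2 ** 12) ** 3   # ~68.7 billion colors at 12 bits/channel
print(twelve_bit // ten_bit)  # 64, i.e. 2 extra bits x 3 channels = 2^6
```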
HDR10 is 10 bit, and Dolby Vision is 12 bit.
Yes, but for HDR10 he is using 10 bits + 2 bits, and for Dolby Vision he is using 12 bits + 2 bits in his calculations. It should just be 10 bits and 12 bits per channel in those calculations.
This TV has a separate breakout box that could potentially be upgraded, if the TV supports those features. Hopefully, it does.
One of the many reasons you shouldn't have bought a half baked 4K TV last year.
Guess what: at the point in time you can buy a TV with all these features, it will be "half baked" in comparison to the next standard already in the making again :OO
What? We have that now. This just makes a few things more flexible.
Guess what: at the point in time you can buy a TV with all these features, it will be "half baked" in comparison to the next standard already in the making again :OO
Truth.
Well, that's great.
Given the usual timing between HDMI spec announcements and actual implementation, I don't expect it to be fully supported before 2019 though.
We do not have that now, because ARC only supports 5.1 lossy audio.
Guess what: at the point in time you can buy a TV with all these features, it will be "half baked" in comparison to the next standard already in the making again :OO
Game Mode VRR features variable refresh rate, which enables a 3D graphics processor to display the image at the moment it is rendered for more fluid and better detailed gameplay, and for reducing or eliminating lag, stutter, and frame tearing.
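To picture what that buys you, here's a toy frame-pacing comparison (my own illustration with made-up render times, not anything from the spec, and ignoring the panel's supported refresh range): with fixed 60 Hz vsync a late frame waits for the next refresh slot, while with VRR the display refreshes as soon as the frame is ready.

```python
# Toy frame-pacing comparison: fixed 60 Hz vsync vs. variable refresh rate.
# Render completion times are hypothetical; the point is how display times diverge.
import math

REFRESH_MS = 1000 / 60                 # one 60 Hz scanout slot is ~16.67 ms
render_done = [10, 30, 55, 73, 95]     # hypothetical frame completion times (ms)

for done in render_done:
    # Fixed refresh: the frame waits for the next vsync boundary after it finishes.
    fixed = math.ceil(done / REFRESH_MS) * REFRESH_MS
    # VRR: the display starts a refresh as soon as the frame is ready.
    print(f"frame ready at {done:5.1f} ms -> fixed vsync shows it at {fixed:6.2f} ms, VRR at {done:5.1f} ms")
```

Every row where the fixed-vsync time lands well after the VRR time is the stutter and lag you'd feel on a non-VRR display.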
That is a substantial chunk of change. My hope - based on no evidence whatsoever - is that some enterprising Chinese company releases a compatible version at a cut-down price.
Are you keeping your KS8000?
So will Scorpio support adaptive v-sync?
So will Scorpio support adaptive v-sync?
Would be a big win for MS if they did since they keep banging the "premium console" drum. We'll wait and see...
So will Scorpio support adaptive v-sync?
Nah, I doubt you'll see anything with HDMI 2.1 until 2018 at the earliest.
So will Scorpio support adaptive v-sync?
Although it [Game Mode VRR] is an HDMI 2.1 feature, the new 48G cable isn't required for today's resolutions - and in theory, this element of the protocol could be retrofitted to existing consoles paired with HDMI 2.1 screens (as we've seen in the past with PS3 3D and PS4 HDR support added to existing consoles via firmware updates).
http://www.eurogamer.net/articles/digitalfoundry-2017-hdmi-2-1-specifications-revealed
Why are you adding 2 bits per channel to the bit depth per color?
Scorpio? Dream on 😂
Technological innovation never stops. You could make the same argument every year, forever.
It's most likely going to support this via firmware update at the least, so your amusement may be short lived.
Forget the firmware update, this shit will be in at launch. Tell me you believe, Senjutsu.
Truth.
Technology will never stop advancing unless technology stops.
There will always be something new around the corner.
Someone makes this exact statement every single year. They must still be on the very first TV they ever bought if they truly feel that way.
I got it from here:
http://k.kramerav.com/downloads/white-papers/effects_of_color_depth_4.pdf
"Once the Pixel Clock is determined the bandwidth can be calculated with the following formula:
Bandwidth = Pixel_Clock * (bit_depth_per_color + 2)"
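If it helps, here's how that formula plays out numerically. This is my own back-of-the-envelope, assuming the standard 594 MHz pixel clock for 4K60 4:4:4 (CTA-861 timing), treating the formula as per colour channel, and summing over the three TMDS data channels:

```python
# White paper formula vs. a plain 8b/10b-overhead estimate, per bit depth.
PIXEL_CLOCK_MHZ = 594          # 4K60, 4:4:4
CHANNELS = 3                   # R, G, B (one TMDS lane each)

def whitepaper_gbps(bits_per_color):
    """Bandwidth = Pixel_Clock * (bit_depth_per_color + 2), summed over 3 channels."""
    return PIXEL_CLOCK_MHZ * (bits_per_color + 2) * CHANNELS / 1000

def tmds_8b10b_gbps(bits_per_color):
    """Same pixel clock, but modelling the 8b/10b encoding overhead as a flat 10/8 factor."""
    return PIXEL_CLOCK_MHZ * bits_per_color * (10 / 8) * CHANNELS / 1000

for depth in (8, 10, 12):
    print(f"{depth}-bit: whitepaper {whitepaper_gbps(depth):.2f} Gbps, "
          f"8b/10b {tmds_8b10b_gbps(depth):.2f} Gbps")
```

At 8-bit both come out at ~17.8 Gbps (right at HDMI 2.0's 18 Gbps limit), because 8 + 2 = 8 x 10/8; at 10 and 12 bits they drift apart, which is why the "+2" looks to me like an approximation of the 8b/10b overhead rather than literal padding bits.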
To use a new HDMI standard what exactly is physically different? New cables? Different ports? Different chip on motherboard?
Can we just move to Displayport, please? Please?
Gah, this is making me not want to make the jump to 4k for a long time. Really thinking I might stick with 1080p for the rest of this gen and go 4k when PS5 comes out.
Hell, I'd be happy if we can get ONE Displayport connector on TVs.
Forget the firmware update, this shit will be in at launch. Tell me you believe, Senjutsu.
Why? What do you gain?
If 64 times the color palette is "a little bit" then yes.
Wow, that's a lot. Why is HDR10 the most used? Is it cheaper/easier to do?
Most video cards only have one HDMI port, and for me that will be used up by a VR headset. Would rather not continue to bother with DVI or DP adapters for the times when I decide to hook my PC up to the large TV.
Right now I have no spare HDMI ports either, and have to disconnect a device to plug in anything new that isn't already in my media setup. Though to be fair, that's more down to me needing a new AVR with more HDMI ports than anything.
Wow, that's a lot. Why is HDR10 the most used? Is it cheaper/easier to do?
I am not sure what the writer is doing here; I don't think the HDMI spec requires 2 bits of padding between each pixel value. Maybe they are trying to account for the 8b/10b TMDS encoding overhead by adding in 2 bits.
Ah, didn't factor in VR. Even then, though, it seems the proper thing to do would be for VR to support DisplayPort, since AV equipment has no use for DisplayPort; it's primarily focused on HDMI. So DP to VR and HDMI to TV would be the way to go, I would think. DP would be pretty useless on a TV from an AV enthusiast's perspective; most wouldn't utilize it.
Yeah, as Durante mentions earlier, having VR move towards DP would be the better solution. Though if the ultimate VR plan is to get to a wireless setup, I dunno how that would work. Are there wireless DP solutions? I haven't checked... I'm aware of the HDMI side of things in that regard.
a general purpose checkerboard/scaling programmable chip ala PS4 Pro.
It's not a chip, it is a feature of the GPU. And it's "just" a free-to-use history buffer created at zero cost to rendering power. It provides opportunities like checkerboard rendering, but it is in the developers' hands to use as they see fit. It's not a fixed-function anything. (At least I read the given explanations that way, not claiming inside knowledge.)
With the advent of 4K at 120 Hz, Dynamic HDR, and VRR/adaptive sync, I think it's to be expected that the PS5 comes with a general purpose checkerboard/scaling programmable chip ala PS4 Pro.
That, combined with a mature Vega/Ryzen/HBM APU, is a pretty crazy set of tools for developers to chase image quality with; pixel counting will be a thing of the past.
It's not a chip, it is a feature of the GPU. And it's "just" a free-to-use history buffer created at zero cost to rendering power. It provides opportunities like checkerboard rendering, but it is in the developers' hands to use as they see fit. It's not a fixed-function anything. (At least I read the given explanations that way, not claiming inside knowledge.)
It's crazy how just this HDMI 2.1 spec standard got me hyped as hell for the PS5 lol. Game Mode VRR in my veins now, please.
And yes, I expect it to become standard at least in AMD GPUs, and pretty much guaranteed in the PS5.
Guess what: at the point in time you can buy a TV with all these features, it will be "half baked" in comparison to the next standard already in the making again :OO
There do exist technology plateaus that remain current and relevant for several years. It was obvious 1.3, 1.4, and 2.0 were doomed specs from the beginning. 2.1 seems very future-proof by comparison; the question is if/when displays are actually going to support the features.
1.3 and 1.4 were not "doomed." They served their purpose for their time and did it well.