
HDMI 2.1 - higher resolution, higher refresh rate, hello adaptive sync and more

Belker

Member
Well, the last one they released is still $400.

http://www.samsung.com/us/televisio...3500u-one-connect-evolution-kit-sek-3500u-za/

They did skip releasing one in 2016 in favor of updating the OS, but maybe all these spec changes would convince them to do another.

That is a substantial chunk of change. My hope - based on no evidence whatsoever - is that some enterprising Chinese company releases a compatible version at a cut-down price.

Are you keeping your KS8000?
 
Yes, but for HDR10 he is using 10 bits + 2 bits and for Dolby Vision he is using 12 bits + 2 bits in his calculations. It should just be 10 bits and 12 bits per channel in those calculations.

Ah, missed that part. I thought you were just questioning why 12 bits was being calculated instead of 10.
 

Vuze

Member
One of the many reasons you shouldn't have bought a half baked 4K TV last year.
Guess what: at the point in time you can buy a TV with all these features, it will be "half baked" in comparison to the next standard already in the making again :OO
 
Guess what: at the point in time you can buy a TV with all these features, it will be "half baked" in comparison to the next standard already in the making again :OO
Truth.

Technology will never stop advancing unless technology stops.

There will always be something new around the corner.

Someone makes this exact statement every single year. They must still be on the very first TV they ever bought if they truly feel that way.
 
We do not have that now, because ARC only supports 5.1 lossy audio.

What you're talking about does not reduce the number of cables. You still need a cable for every device and a cable that goes between the TV and receiver. Whether you have individual cables going to your TV or your receiver doesn't change the number of cables.

Guess what: at the point in time you can buy a TV with all these features, it will be "half baked" in comparison to the next standard already in the making again :OO

I disagree. These things come in waves and are not a constant linear progression, because content drives their utilization, and without standards to adhere to it would be chaos. As long as you hit the key pillars, you've hit the majority of the functionality.
 

wildfire

Banned
Game Mode VRR features variable refresh rate, which enables a 3D graphics processor to display the image at the moment it is rendered for more fluid and better detailed gameplay, and for reducing or eliminating lag, stutter, and frame tearing.

And now one of the nicer advantages Nintendo's software had over the declining stability of 3rd party software is going to be rendered moot. I'm glad this problem is going to be fixed.
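
For anyone who wants to see why variable refresh helps: here's a tiny toy model (my own illustration, not from the spec) comparing when frames actually reach the screen on a fixed 60 Hz display versus a VRR display, using made-up render times.

```python
# Toy model: when does each frame reach the screen on a fixed 60 Hz display
# vs. a VRR display? Render times below are made up for illustration.

def present_times(render_ms, vsync_ms=None):
    """Return the time (ms) at which each frame appears on screen.

    With vsync_ms set (fixed refresh), a finished frame waits for the next
    refresh tick; with vsync_ms=None (VRR), the display scans out the frame
    the moment it is ready.
    """
    shown = []
    t = 0.0
    for ms in render_ms:
        t += ms                          # frame finishes rendering at time t
        if vsync_ms is None:
            shown.append(round(t, 2))    # VRR: shown immediately
        else:
            ticks = -(-t // vsync_ms)    # ceiling division to next vsync
            shown.append(round(ticks * vsync_ms, 2))
    return shown

frames = [14, 22, 18, 25, 16]            # hypothetical render times in ms
print("fixed 60 Hz:", present_times(frames, vsync_ms=1000 / 60))
print("VRR:        ", present_times(frames))
# The fixed-refresh output snaps to 16.7 ms ticks (occasionally skipping one,
# i.e. stutter); the VRR output tracks the render times directly.
```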
 

Heel

Member
That is a substantial chunk of change. My hope - based on no evidence whatsoever - is that some enterprising Chinese company releases a compatible version at a cut-down price.

Are you keeping your KS8000?

Yeah, I'll probably keep it for at least a few years. If the next consoles end up supporting adaptive sync, I'll be back in the market.

The only tangible benefit I can see in the interim for actual content is dynamic metadata for HDR10 and 10bit RGB 4:4:4 at 4K60 with HDR. I don't see games and movies supporting more than that for a good while, and even those things are currently theoretical and arguably negligible.
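
As a rough back-of-envelope check of that last point (assuming the common 594 MHz 4K60 pixel clock with blanking, three TMDS channels, and 8b/10b line coding), 10-bit RGB 4:4:4 at 4K60 doesn't fit in HDMI 2.0's 18 Gbps:

```python
# Back-of-envelope check. Assumptions: the common 594 MHz 4K60 pixel clock
# (4400 x 2250 x 60, i.e. active pixels plus blanking), three TMDS data
# channels, and HDMI's 8b/10b line coding.
PIXEL_CLOCK_HZ = 594e6       # 4K60 total pixel rate, including blanking
CHANNELS = 3                 # R, G, B on three TMDS lanes
TMDS_OVERHEAD = 10 / 8       # 8 data bits travel as 10 line bits

def line_rate_gbps(bits_per_channel):
    return PIXEL_CLOCK_HZ * bits_per_channel * CHANNELS * TMDS_OVERHEAD / 1e9

print(f"4K60  8-bit RGB 4:4:4: {line_rate_gbps(8):.1f} Gbps")   # ~17.8, just fits
print(f"4K60 10-bit RGB 4:4:4: {line_rate_gbps(10):.1f} Gbps")  # ~22.3, over the
# 18 Gbps HDMI 2.0 limit, hence the wait for HDMI 2.1's 48 Gbps.
```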
 
So will Scorpio support adaptive v-sync?

I think Scorpio will support it, even if it is done at a later date by a firmware update. The same is probably true for at least the PS4 Pro. From Eurogamer...
Although it [Game Mode VRR] is an HDMI 2.1 feature, the new 48G cable isn't required for today's resolutions - and in theory, this element of the protocol could be retrofitted to existing consoles paired with HDMI 2.1 screens (as we've seen in the past with PS3 3D and PS4 HDR support added to existing consoles via firmware updates).
http://www.eurogamer.net/articles/digitalfoundry-2017-hdmi-2-1-specifications-revealed
 
Truth.

Technology will never stop advancing unless technology stops.

There will always be something new around the corner.

Someone makes this exact statement every single year. They must still be on the very first TV they ever bought if they truly feel that way.

No, you can't make the same argument, because each progression and advance in tech is not weighted equally. The focus should be on the pillars, not the smaller nice-to-have features, and the pillars are driven by the content. In the 9 years since I got my high-end 1080p set, what key feature/functionality am I missing? 3D at best, and look where that is. Jumping in at the right time with a 1080p set has kept me set for the last 9 years and likely several more to come, until 4K with HDR has settled in with a proper set. Then I can jump on that and ride it out another decade plus, rather than jumping in too early, before the pillars have been set by actually having content available that determines what you really need in a TV. This notion that tech advances every year is a terrible argument and terrible advice to tell someone to just buy now.
 
To use a new HDMI standard what exactly is physically different? New cables? Different ports? Different chip on motherboard?
 
I got it from here:
http://k.kramerav.com/downloads/white-papers/effects_of_color_depth_4.pdf

"Once the Pixel Clock is determined the bandwidth can be calculated with the following formula:
Bandwidth = Pixel_Clock * (bit_depth_per_color + 2)"

I am not sure what the writer is doing here; I don't think the HDMI spec requires 2 bits of padding between each pixel value. Maybe they are trying to compensate for the 8b/10b TMDS encoding by adding in 2 bits.
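
One possible reading (a guess, not documentation): the "+2" only matches the 8b/10b TMDS overhead exactly at 8 bits per channel, which may be where the rule of thumb came from.

```python
# Compare the whitepaper's "(bit depth + 2)" rule with plain 8b/10b TMDS
# overhead (bit depth x 10/8). They only agree at 8 bits per channel.
for bits in (8, 10, 12):
    plus_two = bits + 2             # Kramer formula: Pixel_Clock * (bit_depth + 2)
    eight_b_ten_b = bits * 10 / 8   # 10 line bits for every 8 data bits
    print(f"{bits:2d}-bit: +2 rule = {plus_two:2d} line bits, 8b/10b = {eight_b_ten_b:4.1f}")
# Output: 8-bit matches (10 vs 10.0); 10-bit gives 12 vs 12.5; 12-bit gives 14 vs 15.0.
```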

To use a new HDMI standard what exactly is physically different? New cables? Different ports? Different chip on motherboard?

New chips on the mobo (TMDS chip), new chip on the TV (TCON), and 48Gbps will require new cables as well for reduced EMI. Maybe some Samsung TVs might be able to upgrade to the 2.1 spec via a One Connect box.
 

XiaNaphryz

LATIN, MATRIPEDICABUS, DO YOU SPEAK IT
Why? What do you gain?

Most video cards only have one HDMI port, and for me that will be used up by a VR headset. Would rather not continue to bother with DVI or DP adapters for the times when I decide to hook my PC up to the large TV.

Right now I have no spare HDMI ports either and have to disconnect a device to plug in something new not already in my media setup. Though to be fair that's more to me needing a new AVR with more HDMI ports than anything.
 
Most video cards only have one HDMI port, and for me that will be used up by a VR headset. Would rather not continue to bother with DVI or DP adapters for the times when I decide to hook my PC up to the large TV.

Right now I have no spare HDMI ports either and have to disconnect a device to plug in something new not already in my media setup. Though to be fair that's more to me needing a new AVR with more HDMI ports than anything.

Ah, didn't factor in VR. Even then, though, it seems the proper thing would be for VR to support DisplayPort, since AV equipment has no use for DisplayPort and is primarily focused on HDMI. So DP to VR and HDMI to TV would be the way to go, I would think. I think DP would be pretty useless on a TV from an AV enthusiast's perspective; most wouldn't utilize it.

Wow, that's a lot. Why is HDR10 the most used? Is it cheaper/easier to do?

Dolby Vision requires additional hardware to be licensed from Dolby for TV manufacturers to support. HDR10 doesn't.
 
I am not sure what the writer is doing here; I don't think the HDMI spec requires 2 bits of padding between each pixel value. Maybe they are trying to compensate for the 8b/10b TMDS encoding by adding in 2 bits.

Yeah, I'm not sure. They try to break it down in their calculator, but some of their labels seem meaningless to me (specifically Clock/Pixel). Maybe they are doing some post processing so it requires more than one clock?
https://k.kramerav.com/support/bwcalculator.asp


Resolution: 3840 x 2160 = 8,294,400 pixels/frame
8,294,400 pixels/frame x 120 = 995,328,000 pixels/second
995,328,000 pixels/second x 1.20 Color Depth Factor (Clock/Pixel) x 10 Coefficient (Bits/Clock) = 11.94 Gbps bandwidth per channel
11.94 Gbps x 3 channels = 35.83 Gbps total signal bandwidth

¯\_(ツ)_/¯
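
For what it's worth, the calculator's numbers do reproduce if you read "Clock/Pixel" as (bit depth + 2) / 10 and "Bits/Clock" as one 10-bit TMDS character, i.e. the same "(bit depth + 2)" rule as the whitepaper, applied per channel. That interpretation is a guess at the labels, but the arithmetic lines up:

```python
# Reproducing the calculator for 3840x2160 @ 120 Hz, 10-bit color, reading
# "Clock/Pixel" as (bit_depth + 2) / 10 and "Bits/Clock" as one 10-bit TMDS
# character. That's a guess at the labels, but the arithmetic lines up.
pixels_per_frame  = 3840 * 2160                 # 8,294,400
pixels_per_second = pixels_per_frame * 120      # 995,328,000
clock_per_pixel   = (10 + 2) / 10               # 1.20 "Color Depth Factor"
bits_per_clock    = 10                          # one TMDS character
per_channel = pixels_per_second * clock_per_pixel * bits_per_clock / 1e9
total       = per_channel * 3                   # three TMDS data channels
print(f"{per_channel:.2f} Gbps per channel")    # 11.94
print(f"{total:.2f} Gbps total")                # 35.83
```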
 

XiaNaphryz

LATIN, MATRIPEDICABUS, DO YOU SPEAK IT
Ah, didn't factor in VR. Even then, though, it seems the proper thing would be for VR to support DisplayPort, since AV equipment has no use for DisplayPort and is primarily focused on HDMI. So DP to VR and HDMI to TV would be the way to go, I would think. I think DP would be pretty useless on a TV from an AV enthusiast's perspective; most wouldn't utilize it.

Yeah, as Durante mentions earlier having VR move towards DP would be the better solution. Though if the ultimate VR plan is to get to a wireless setup, I dunno how that would work. Are there wireless DP solutions? I haven't checked...I'm aware of the HDMI side of things in that regard.
 

Wollan

Member
With the advent of 120Hz 4K, Dynamic HDR and VRR/adaptive sync, I think it's to be expected that the PS5 comes with a general purpose checkerboard/scaling programmable chip a la PS4 Pro.
That, combined with a mature Vega/Ryzen/HBM APU, makes for a pretty crazy set of tools for developers to chase image quality with; pixel counting will be a thing of the past.
 
Yeah, as Durante mentions earlier having VR move towards DP would be the better solution. Though if the ultimate VR plan is to get to a wireless setup, I dunno how that would work. Are there wireless DP solutions? I haven't checked...I'm aware of the HDMI side of things in that regard.

My guess is that wireless VR is going to take the form of some other protocol/connection rather than taking an existing DisplayPort/HDMI/DVI connection and making that part wireless. That might be the short-term stopgap as people hack it in, but for the long term it's going to take a different form factor to achieve.
 

Planet

Member
a general purpose checkerboard/scaling programmable chip a la PS4 Pro.
It's not a chip, it is a feature of the GPU. And it's "just" a free-to-use history buffer created at zero cost to rendering power. It provides opportunities like checkerboard rendering, but it is in the developers' hands to use as they see fit. It's not fixed-function anything. (At least that's how I read the given explanations; not claiming inside knowledge.)

And yes, I expect it to become standard at least in AMD GPUs, and pretty much guaranteed in the PS5.
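
To make the checkerboard idea above concrete, here's a conceptual toy only — not the PS4 Pro's actual pipeline (real implementations use motion vectors and smarter reconstruction) — showing how half the pixels are shaded each frame and the rest are filled from the previous frame's history buffer:

```python
# Conceptual toy only -- not the PS4 Pro's actual pipeline. The basic
# checkerboard idea: shade half the pixels each frame in a checker pattern
# and fill the other half from the previous frame (the "history buffer"),
# reconstructing a full-resolution image at roughly half the shading cost.
def checkerboard_frame(shade, history, frame_index, width, height):
    """shade(x, y) computes a newly rendered pixel; history is last frame's image."""
    out = [[0] * width for _ in range(height)]
    for y in range(height):
        for x in range(width):
            if (x + y + frame_index) % 2 == 0:
                out[y][x] = shade(x, y)      # freshly shaded this frame
            else:
                out[y][x] = history[y][x]    # reused from the history buffer
    return out

# Alternate the pattern each frame so every pixel gets refreshed at least
# every other frame.
W, H = 8, 4
history = [[0] * W for _ in range(H)]
for frame in range(4):
    history = checkerboard_frame(lambda x, y, f=frame: f * 100 + x + y,
                                 history, frame, W, H)
```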
 

coughlanio

Member
I think I'll hold out for a HDMI 2.1 TV before I consider any upgrades. I have a Philips BDM4065UC 40" 4K monitor that I use as a TV, which unfortunately only supports HDMI 1.4, but thankfully has DP1.2 which should see me through gaming wise (I only play on PC, may get a Switch too, which I doubt will be a 4K machine).

Exciting things are coming, especially with the likes of Scorpio possibly having HDMI 2.1 support.
 

Kaako

Felium Defensor
With the advent of 120Hz 4K, Dynamic HDR and VRR/adaptive sync, I think it's to be expected that the PS5 comes with a general purpose checkerboard/scaling programmable chip a la PS4 Pro.
That, combined with a mature Vega/Ryzen/HBM APU, makes for a pretty crazy set of tools for developers to chase image quality with; pixel counting will be a thing of the past.
It's not a chip, it is a feature of the GPU. And it's "just" a free-to-use history buffer created at zero cost to rendering power. It provides opportunities like checkerboard rendering, but it is in the developers' hands to use as they see fit. It's not fixed-function anything. (At least that's how I read the given explanations; not claiming inside knowledge.)

And yes, I expect it to become standard at least in AMD GPUs, and pretty much guaranteed in the PS5.
It's crazy how just this HDMI 2.1 spec standard got me hyped as hell for the PS5 lol. Game mode VRR in my veins now, please.
 

Reallink

Member
Guess what: at the point in time you can buy a TV with all these features, it will be "half baked" in comparison to the next standard already in the making again :OO

There do exist technology plateaus that remain current and relevant for several years; HDMI 1.0 was relevant for 10 years or more. It was obvious 1.3, 1.4, and 2.0 were doomed specs from the beginning, and that's the sole reason I haven't bought AVRs or TVs the last 3 or 4 years. 2.1 seems very future proof by comparison; the question is if/when displays are actually going to support the features. If they're actually shipping them in Q2, it seems at least the high-end 2018s will be equipped, making any 2017 model a fool's buy.
 

Mindman

Member
There do exist technology plateaus that remain current and relevant for several years. It was obvious 1.3, 1.4, and 2.0 were doomed specs from the beginning. 2.1 seems very future proof by comparison; the question is if/when displays are actually going to support the features.

1.3 and 1.4 were not "doomed." They served their purpose for their time and did it well.
 

Chao

Member
Nothing is future proof anymore. Jesus, I JUST bought a 2016 receiver and OLED TV and they're already obsolete thanks to another fucking new HDMI spec.

Really makes me want to turn on my old crt tv and play genesis and super nintendo games for the rest of my life.
 

Reallink

Member
1.3 and 1.4 were not "doomed." They served their purpose for their time and did it well.

Yes, not 1.3; I was confusing 1.3 and 1.4. I'm referring to 1.4's inability to handle more than 30Hz 1080p 3D, which, had 3D actually taken off (something no one knew at the time), would have doomed it outright. That was the point it became obvious that buying anything you expect to last more than 2 years would be stupid, and they've just continued to trickle out half-step, bandwidth-starved, missing-critical-feature iterations since then. For once this actually seems to have everything you could reasonably forecast for the foreseeable future.
 