
Television Displays and Technology Thread: This is a fantasy based on OLED

It's been looking fine through the HBO Go app.

Interesting... So no banding at all? Or is it minimal?

The first episode of this season (very mild S7E1 GoT spoiler )
where they show the walkers slowly walking directly towards the camera, and there is a bunch of mist/smoke/w/e, the banding was absolutely horrendous

So are you referring to an LG HBO Go App?
 

BumRush

Member
Interesting... So no banding at all? Or is it minimal?

The first episode of this season (very mild S7E1 GoT spoiler )
where they show the walkers slowly walking directly towards the camera, and there is a bunch of mist/smoke/w/e, the banding was absolutely horrendous

So are you referring to an LG HBO Go App?

Banding in that scene was insane. My wife - who never notices any image related issues - picked up on it.
 
Banding in that scene was insane. My wife - who never notices any image related issues - picked up on it.

Yes!

And then this past weekend's wasn't amazing either, because most of the episode took place at night or in dark caverns. Honestly, the show would benefit immensely from HDR.

But now I'm super curious if it's an Amazon issue and not an HBO Go issue. I'd happily switch if that's the case.
 

tmdorsey

Member
Interesting... So no banding at all? Or is it minimal?

The first episode of this season (very mild S7E1 GoT spoiler )
where they show the walkers slowly walking directly towards the camera, and there is a bunch of mist/smoke/w/e, the banding was absolutely horrendous

So are you referring to an LG HBO Go App?

I don't have an LG, I have a Sony LED (X930E), and I didn't notice any of that at all.
 
I don't have an LG, I have a Sony LED (X930E), and I didn't notice any of that at all.

Ahhh, yeah that makes sense, as I'm guessing the set is using Sony's debanding processing to ameliorate the issue.

I thought you were saying you had no issue with the LG, which would have been impressive.
 

ToD_

Member
My coworker bought a Sony X900E recently. I helped him with calibration and such when he got it. He stopped by my desk last week complaining about this very scene, and how distracting the banding was. He was concerned the TV was unable to handle smooth gradients.

I also watched it on my Pioneer Kuro KRP500M, and the banding in that scene was terrible.

I imagine it also matters what device you use to stream. Some devices or internal apps may apply some processing (debanding). I hear some (most?) Sony TVs do have a debanding option, so I wonder if my coworker had that disabled.
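Since a few posts here touch on what these debanding/smooth-gradation filters are actually up against, here's a rough, hedged sketch in plain NumPy (nothing vendor-specific; the numbers are purely illustrative) of how a smooth near-black gradient collapses into bands under 8-bit quantization, and how dithering hides the steps:

```python
import numpy as np

def mean_run_length(x):
    # Average length of constant-value runs; long runs read as visible bands.
    boundaries = np.count_nonzero(np.diff(x.astype(int))) + 1
    return len(x) / boundaries

# A subtle near-black ramp, like mist in a night scene (values on a 0..1 scale).
width = 1920
gradient = np.linspace(0.02, 0.08, width)

# Straight 8-bit quantization: only ~16 distinct levels survive, in wide bands.
banded = np.round(gradient * 255).astype(np.uint8)

# Adding sub-step noise before quantizing (dithering) trades the hard steps
# for fine grain the eye averages out, which is roughly the effect debanding
# filters try to approximate after the fact by smoothing the gradient.
rng = np.random.default_rng(0)
dithered = np.round(gradient * 255 + rng.uniform(-0.5, 0.5, width)).astype(np.uint8)

print("mean run length, banded:  ", mean_run_length(banded))    # ~120 pixels per band
print("mean run length, dithered:", mean_run_length(dithered))  # a few pixels
```

The point being: once the source (a low-bitrate stream) has crushed the gradient into steps, the TV can only guess at the original ramp, which is why results vary so much set to set.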
 
Sony's de-banding system is the best in the business. The X1E improved a lot in that regard too.

Seems like an odd omission from some other mfgs. That being said, literally my only complaint on my C7 is GoT. I don't have issues anywhere else; Netflix/Amazon HDR stuff looks great, PC looks great, it is quite literally one TV show =/
 

Klotera

Member
My coworker bought a Sony X900E recently. I helped him with calibration and such when he got it. He stopped by my desk last week complaining about this very scene, and how distracting the banding was. He was concerned the TV was unable to handle smooth gradients.

I also watched it on my Pioneer Kuro KRP500M, and the banding in that scene was terrible.

I imagine it also matters what device you use to stream. Some devices or internal apps may apply some processing (debanding). I hear some (most?) Sony TVs do have a debanding option, so I wonder if my coworker had that disabled.


The setting is called Smooth Gradation on the X900E. It's good to have on, especially for lower-quality sources, but best to try to keep it on Low. I can't speak for that exact scene, but the splash screen for PlayStation Vue has banding that I can easily see on my other TVs that is much better on my Sony.
 

Belker

Member
You aren't wrong academically speaking, abstracted from real world constraints, but you can look at the last couple pages for my thoughts on 2.1.

At the end of the day, we may not see sets until CES (or later) that even have the functionality, and then it is likely at least 2-3 years before anything really takes advantage of VRR. And lossless via ARC is a huge unknown (there are no streaming services offering lossless today) so how you'd ever utilize that remains to be seen. And this is again setting aside that a streaming box will likely accomplish anything you'd want ARC for anyway.

We aren't talking like 1.4 to 2.0 where it enabled HDR/4K (big deal), we're talking about somewhat esoteric features that, even if enabled, may not necessarily have any industry support anyway.

Just my two cents.


Good points, though the outcome for the person who was thinking about upgrading is the same, I think - don't sell the KS8000 yet.

I'm not too fussed about lossless sound, so I'd trade that in for VRR. I wonder how esoteric it will really be, given that the group that controls the standard decided it was important enough to include and, presumably, the console market will have grown by then.

I'm wondering what 2.2 would do. More bandwidth for more data allowing more resolution and colours etc?
 
Good points, though the outcome for the person who was thinking about upgrading is the same, I think - don't sell the KS8000 yet.

I'm not too fussed about lossless sound, so I'd trade that in for VRR. I wonder how esoteric it will really be, given that the group that controls the standard decided it was important enough to include and, presumably, the console market will have grown by then.

I'm wondering what 2.2 would do. More bandwidth for more data allowing more resolution and colours etc?

I mean someone who is even contemplating upgrading a TV less than a year after buying it is a person who may want to upgrade now and then again in a year or two anyway, but sure.

I can't quite decipher what you're saying in the couple sentences about VRR.

More bandwidth is already in 2.1: 48 Gbps (vs 18 Gbps now).

Here's a quick write up for more info (from the horse's mouth, as it were).

http://www.hdmi.org/manufacturer/hdmi_2_1/

P.S. Related to the bandwidth stuff, more frames is cool (since buying the C7, I've moved to 1080p120 over 4K60 for Overwatch on PC), but 8K? I'm not sure who that's for other than front-projector users. We're only now starting to see a constant flow of 4K Blu-rays. God knows when we'd see 8K.
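To put rough numbers on the bandwidth point, here's a back-of-envelope sketch (active pixels only; real HDMI links also carry blanking intervals and encoding overhead, so actual requirements run noticeably higher):

```python
def data_rate_gbps(width, height, fps, bits_per_channel, channels=3):
    # Uncompressed video payload in gigabits per second (active pixels only).
    return width * height * fps * bits_per_channel * channels / 1e9

# 4K60 10-bit 4:4:4 already crowds HDMI 2.0's 18 Gbps once overhead is added.
print(round(data_rate_gbps(3840, 2160, 60, 10), 1))   # 14.9

# 4K120 10-bit 4:4:4 is simply impossible without HDMI 2.1's 48 Gbps link.
print(round(data_rate_gbps(3840, 2160, 120, 10), 1))  # 29.9
```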

Remember that a TV can't just do VRR on its own. It needs an HDMI 2.1 device connected to it that can also do VRR. In terms of consoles, the first device to handle VRR will probably be the PS5, whose release date is anyone's guess. For PC, we aren't even sure when Nvidia and AMD will offer graphics cards with HDMI 2.1. We don't even know if Nvidia even cares about HDMI 2.1, because VRR does not equal G-Sync. For VRR to work on an Nvidia GPU (which most people own) there would need to be a G-Sync chip added to a VRR-enabled TV, and how much does that add to cost? Will TV manufacturers even want to manufacture two different TVs, one with G-Sync and one without? VRR sounds amazing on paper, but the actual implementation will be iffy.

Very well said. Again, academic versus where the rubber actually meets the road in the real world.
 

Mrbob

Member
Sony's de-banding system is the best in the business. The X1E improved a lot in that regard too.

Sony TVs also seem the best at scaling non-native content.

Good points, though the outcome for the person who was thinking about upgrading is the same, I think - don't sell the KS8000 yet.

I'm not too fussed about lossless sound, so I'd trade that in for VRR. I wonder how esoteric it will really be, given that the group that controls the standard decided it was important enough to include and, presumably, the console market will have grown by then.

I'm wondering what 2.2 would do. More bandwidth for more data allowing more resolution and colours etc?

Remember that a TV can't just do VRR on its own. It needs an HDMI 2.1 device connected to it that can also do VRR. In terms of consoles, the first device to handle VRR will probably be the PS5, whose release date is anyone's guess. For PC, we aren't even sure when Nvidia and AMD will offer graphics cards with HDMI 2.1. We don't even know if Nvidia even cares about HDMI 2.1, because VRR does not equal G-Sync. For VRR to work on an Nvidia GPU (which most people own) there would need to be a G-Sync chip added to a VRR-enabled TV, and how much does that add to cost? Will TV manufacturers even want to manufacture two different TVs, one with G-Sync and one without? VRR sounds amazing on paper, but the actual implementation will be iffy.

This is assuming everything works perfectly with the HDMI 2.1 release, when every previous new HDMI release has been riddled with issues.

On top of this, just because you have an HDMI 2.1 TV doesn't mean it can handle the spec for certain HDMI 2.1 functions. For HDR, the nit count needs to increase to hit the HDR mastering spec for Dolby Vision, and panels need to go up from 10-bit to 12-bit. The HDMI 2.1 release is really just the start.
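For concreteness on the 10-bit vs 12-bit point, the step counts work out like this (more levels per channel means finer gradient steps, which is exactly what fights banding once HDR stretches the luminance range):

```python
# Levels per channel at common panel bit depths. HDR spreads a much wider
# luminance range across these levels, so the extra bits keep each step
# small enough to stay invisible.
for bits in (8, 10, 12):
    print(f"{bits}-bit: {2 ** bits} levels per channel")
```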
 

Schlonky

Neo Member
Remember that a tv can't just do VRR on it's own. It needs a HDMI 2.1 device connected to it that can also do VRR. In terms of consoles the first device to handle VRR will probably be the PS5, whose release date is?

DF said that Xbox One X will support HDMI 2.1 VRR. It's very likely based on AMD's FreeSync-over-HDMI spec, which is functional today, just not supported by most displays. I wouldn't be at all surprised to see HDMI 2.1 displays that are able to support VRR from AMD cards that support FreeSync over HDMI today.
 
Xbox One X will support HDMI 2.1 VRR. It's very likely based on AMD's FreeSync-over-HDMI spec, which is functional today, just not supported by most displays. I wouldn't be at all surprised to see HDMI 2.1 displays that are able to support VRR from AMD cards that support FreeSync over HDMI today.

No, it won't. The spec page clearly calls out that it supports 2.0b for HDMI version.

Your point is still valid about the possibility of Freesync, but it remains to be seen what that actually means in terms of TV support.
 

Salaadin

Member
I keep hearing how the X900E is horrible at handling Android TV, but my experience hasn't been horrible yet, though it's only been a few days. Any measures I can take beforehand to prevent it from getting bogged down?
 
This is why I want a TV with built-in Chromecast (which basically means a Sony). Every damn media app under the sun supports that, so it doesn't matter if there's no app for it on the TV.
Nah, I really like LG OS and I have Roku. Would not want something clunky (tried Amazon Fire before and hated it)
 

Schlonky

Neo Member
http://www.eurogamer.net/articles/digitalfoundry-2017-project-scorpio-supports-freesync-and-hdmi-vrr

Think of Sony supporting HDR10 on the OG PS4 without supporting any other feature of HDMI 2.0a.

I wasn't questioning your source. The article is incorrect (and from before the full E3 reveal).

http://www.xbox.com/en-us/xbox-one-x

Click "see all tech specs" in the middle of the page.

This isn't at all surprising btw, since we haven't heard a peep from any mfgs regarding actual CE gear that will implement 2.1 (One X would have theoretically been the first consumer device)

I'm aware it's possible with existing HDMI for some version to be supported (but again, not VRR) so it's just as easy to argue that mfgs could update existing sets to support that technology, so again, we're back to it not being worth waiting for 2.1.

This entire conversation is hypothetical because we know nothing other than bullet points on a page for what 2.1 will support.
 
Banding in that scene was insane. My wife - who never notices any image related issues - picked up on it.

Ditto

It looked like video compression artifacts to me. I doubt you'll see that on the GoT BD whenever it shows up.

Maybe some sets do a better job of squelching it. It was pretty awful on my E6.
 
Ditto

It looked like video compression artifacts to me. I doubt you'll see that on the GoT BD whenever it shows up.

Maybe some sets do a better job of squelching it. It was pretty awful on my E6.

Honestly, it's so weird to me that Netflix and Amazon have, in the last few years, accelerated so much spending on original content (Netflix is at something like $6B this year) and they're doing it the right way: 4K and HDR. And yet GoT, arguably the single biggest show on TV right now, isn't in either.

It's weird. Someone a few pages back said he heard at a convention or something that they're working on it, but not there yet...
 

RoadHazard

Gold Member
Honestly, it's so weird to me that Netflix and Amazon have, in the last few years, accelerated so much spending on original content (Netflix is at something like $6B this year) and they're doing it the right way: 4K and HDR. And yet GoT, arguably the single biggest show on TV right now, isn't in either.

It's weird. Someone a few pages back said he heard at a convention or something that they're working on it, but not there yet...

Wait, GoT isn't even 4K, let alone HDR?! That is shocking.
 

Lima

Member
Wait, GoT isn't even 4K, let alone HDR?! That is shocking.

No. It's basically a shittier looking 1080p stream compared to the 1080i satellite. The 1080i satellite image is still the best looking. Kinda sad when you think about it. The iTunes version looks horrible too.
 

Mrbob

Member
DF said that Xbox One X will support HDMI 2.1 VRR. It's very likely based on AMD's FreeSync-over-HDMI spec, which is functional today, just not supported by most displays. I wouldn't be at all surprised to see HDMI 2.1 displays that are able to support VRR from AMD cards that support FreeSync over HDMI today.

Everything is pointing at Xbox One X being HDMI 2.0b. HDMI 2.1 spec isn't even finalized yet. Regarding PS4 and HDR, Dolby Vision and HDR10 can spec down to HDMI 1.4. I haven't seen anything say VRR can spec down to HDMI 2.0b.

The reason I brought up G-Sync and not FreeSync is that Nvidia owns the PC GPU space, so it's way more important for TVs to support G-Sync than FreeSync. It doesn't sound like Vega is going to be AMD's GPU Zen moment. Home theater PC gaming is expanding, but it is still a relatively niche environment. So now we are looking at TV manufacturers supporting a niche (home theater PC gaming) of a niche (AMD GPU owners). Typically support needs a catalyst, and this is why I'm looking at the PS5. Each generation, PlayStation consoles sell tens of millions worldwide, so there is the catalyst. If one manufacturer hops on VRR FreeSync right away, I could see it being Samsung.

I have no problem with someone waiting on HDMI 2.1 for VRR, but I would just be wary of hopping on a new spec right away and expecting proper support. Especially since each new HDMI launch tends to have issues. Even if there is support it may not work. For someone waiting on HDMI 2.1 I would recommend holding out for second generation TVs with HDMI 2.1 support.
 
I can confirm that the GoT stream is pretty bad in terms of quality. Amazon Prime seems to be the only streaming service to support proper 4K with HDR content. Netflix is next, but they seem to lack HDR content. The most mind-blowing show I've watched in HDR is The Grand Tour. There was a scene where Jeremy was wearing a matte black jacket, sitting outside while the sun was shining on his face. The jacket was so dark, while his face was extremely illuminated. It felt very lifelike. I also saw a Dolby Vision movie, and it was incredible how many more colors were being shown, and how bright and dark everything was. While I have been impressed by movies and TV shows, I am actually very disappointed with how games implement HDR. Only Infamous Second Son had a decent HDR mode, and even that was a little too dim for my liking. The rest of the games offered a minor resolution bump, but no real HDR. Uncharted 4's HDR mode was actually pretty bad, with barely any difference. At least it wasn't dim like Infamous. I have been really satisfied with the TCL 55P607 so far.
 
Wait, GoT isn't even 4K, let alone HDR?! That is shocking.
There's no 4K broadcast/stream; that's the problem.
A Westworld UHD Blu-ray is coming, so I would expect that this latest GoT season has also been produced with 4K HDR in mind.
The 2nd episode had plenty of "flames in the night" moments where HDR would fit perfectly.

HBO, as a well-known pay-TV service, not having any 4K channel is pretty sad.
 

Schlonky

Neo Member
I wasn't questioning your source. The article is incorrect (and from before the full E3 reveal).

http://www.xbox.com/en-us/xbox-one-x

Click "see all tech specs" in the middle of the page.

This isn't at all surprising btw, since we haven't heard a peep from any mfgs regarding actual CE gear that will implement 2.1 (One X would have theoretically been the first consumer device)

I'm aware it's possible with existing HDMI for some version to be supported (but again, not VRR) so it's just as easy to argue that mfgs could update existing sets to support that technology, so again, we're back to it not being worth waiting for 2.1.

This entire conversation is hypothetical because we know nothing other than bullet points on a page for what 2.1 will support.

MS can't officially publish that One X supports anything regarding HDMI 2.1 or any of its features until it has undergone compliance testing, so I don't think the absence of HDMI 2.1 VRR support in the official specs is any kind of proof that it will not support the feature.

It's also not at all "easy to argue" manufacturers could update existing sets to support VRR. How many PC monitors have you heard of being upgraded to support FreeSync?

If you wait to buy a set that supports HDMI 2.1 VRR, you will definitely have the feature. If you don't, then you almost certainly won't. If that feature is important to you, then you should wait, or at least not buy anything you won't be willing or able to replace with something that does.
 

Reallink

Member
Sony TVs also seem the best at scaling non-native content.



Remember that a TV can't just do VRR on its own. It needs an HDMI 2.1 device connected to it that can also do VRR. In terms of consoles, the first device to handle VRR will probably be the PS5, whose release date is anyone's guess. For PC, we aren't even sure when Nvidia and AMD will offer graphics cards with HDMI 2.1. We don't even know if Nvidia even cares about HDMI 2.1, because VRR does not equal G-Sync. For VRR to work on an Nvidia GPU (which most people own) there would need to be a G-Sync chip added to a VRR-enabled TV, and how much does that add to cost? Will TV manufacturers even want to manufacture two different TVs, one with G-Sync and one without? VRR sounds amazing on paper, but the actual implementation will be iffy.

This is assuming everything works perfectly with the HDMI 2.1 release, when every previous new HDMI release has been riddled with issues.

On top of this, just because you have an HDMI 2.1 TV doesn't mean it can handle the spec for certain HDMI 2.1 functions. For HDR, the nit count needs to increase to hit the HDR mastering spec for Dolby Vision, and panels need to go up from 10-bit to 12-bit. The HDMI 2.1 release is really just the start.

LOL, GPUs will feature 2.1 as soon as it's available and they launch their next product line. The increased bandwidth allows for >60 Hz 4K and 10/12-bit 4:4:4, as well as dramatically future-proofing VR needs. Of course they care about it; don't be silly. Additionally, the whole point of VRR is to get away from the G-Sync hardware in displays; the functionality and hardware is presumably included in the standardized HDMI 2.1 chipset. Nvidia would have to actively block or disable it, which would almost assuredly violate HDMI certification regs.
 

Mrbob

Member
Well, you sound confident. I'd rather err on the side of caution and see how it plays out first without saying anything definitive. The history of new HDMI releases has shown issues in the first year of a new spec.

I hope it works as intended, because then I can buy a future Nvidia GPU and run VRR on my 2.1-spec TV. If it runs as well as G-Sync, great. But my suspicion is Nvidia is going to want to sell G-Sync chips.
 

Schlonky

Neo Member
Everything is pointing at Xbox One X being HDMI 2.0b. HDMI 2.1 spec isn't even finalized yet. Regarding PS4 and HDR, Dolby Vision and HDR10 can spec down to HDMI 1.4. I haven't seen anything say VRR can spec down to HDMI 2.0b.

If HDMI 2.1 VRR is just taking the existing FreeSync over HDMI implementation and making it official then I think it's very possible that source devices that support FreeSync over HDMI will not have to change anything to work with it regardless of what version of HDMI they support.

The reason I brought up G-Sync and not FreeSync is that Nvidia owns the PC GPU space, so it's way more important for TVs to support G-Sync than FreeSync. It doesn't sound like Vega is going to be AMD's GPU Zen moment. Home theater PC gaming is expanding, but it is still a relatively niche environment. So now we are looking at TV manufacturers supporting a niche (home theater PC gaming) of a niche (AMD GPU owners). Typically support needs a catalyst, and this is why I'm looking at the PS5. Each generation, PlayStation consoles sell tens of millions worldwide, so there is the catalyst. If one manufacturer hops on VRR FreeSync right away, I could see it being Samsung.

I don't expect to ever see a TV with G-Sync support.

I have no problem with someone waiting on HDMI 2.1 for VRR, but I would just be wary of hopping on a new spec right away and expecting proper support. Especially since each new HDMI launch tends to have issues. Even if there is support it may not work. For someone waiting on HDMI 2.1 I would recommend holding out for second generation TVs with HDMI 2.1 support.

HDMI 2.1 VRR/FreeSync is not new tech. It has its roots in the Embedded DisplayPort specification that came out way back in 2009, where it was originally conceived as a power-saving technology for laptop screens. It's not some great technical challenge to implement. In a market where manufacturers are desperate to find ways to differentiate their products, I can see a lot of companies seeing value in being able to check the "VRR support" box.
 
Interesting... So no banding at all? Or is it minimal?

The first episode of this season (very mild S7E1 GoT spoiler )
where they show the walkers slowly walking directly towards the camera, and there is a bunch of mist/smoke/w/e, the banding was absolutely horrendous

So are you referring to an LG HBO Go App?

Yeah that was horrible on my OLED. I was watching it through the PS4 app since LG doesn't seem to have an HBO Go app on their store.

Pretty sure it's down to HBO's shitty streaming bitrate

Yes!

And then this past weekend's wasn't amazing either, because most of the episode took place at night or in dark caverns. Honestly, the show would benefit immensely from HDR.

But now I'm super curious if it's an Amazon issue and not an HBO Go issue. I'd happily switch if that's the case.

Issue was present in the PS4 app. I think it's an HBO issue.
 

Mrbob

Member
Great, but let's pump the brakes a second and just see how it plays out over HDMI 2.1, not assume it's all going to work perfectly on day one. That's all I'm really trying to say. Everything is still speculation until we see it in action.

I'll be a sad panda if there is no true Gsync support on future VRR TVs.
 

Belker

Member
Very well said. Again, academic versus where the rubber actually meets the road in the real world.

I was saying that I'm not sure lossless audio is that important to me. If I were in a position to buy a TV that supported VRR devices (as I already knew was necessary - just as I knew an HD TV needed an HD signal), I wouldn't mind if it didn't have lossless audio. My case is peculiar, in that I only play games with a gaming headset, so I don't have any speakers connected.

As for the bandwidth (etc.), I was wondering what 2.2 would bring, and hoping that you, or other people, would make suggestions. At the end of 2.1's life, this is the discussion we'll be having.

Since they say it supports 8K, what might its new USP be? Improvements to HDR? Might 2.1 be sufficiently advanced that, even if someone were an early adopter, it wouldn't matter too much?

I've been given different points of view here and useful insights into the argument - though a lot of what we're all discussing is a bit academic at the moment.

I suspect there have been numerous, useful, replies since I started typing this.
 

GReeeeN

Member
If you can get the C7 now for cost price, do it.

A couple of reasons why:

1) You might be dead before 2018 comes round
2) You might be able to sell the C7 for close to cost in 2018

This is the approach I've taken with buying the Sony A1.

I have a feeling the C7 will sell for a lot less than cost price in about a year's time, when HDMI 2.1 is the standard across all TVs.
 

Schlonky

Neo Member
LOL, GPUs will feature 2.1 as soon as it's available and they launch their next product line. The increased bandwidth allows for >60 Hz 4K and 10/12-bit 4:4:4, as well as dramatically future-proofing VR needs. Of course they care about it; don't be silly. Additionally, the whole point of VRR is to get away from the G-Sync hardware in displays; the functionality and hardware is presumably included in the standardized HDMI 2.1 chipset. Nvidia would have to actively block or disable it, which would almost assuredly violate HDMI certification regs.

This isn't correct. Much of the HDMI spec is optional. Nvidia is almost certainly free to not support the VRR feature of HDMI 2.1 if they don't want to, while still being compliant, as long as they support all mandatory features.
 

Mrbob

Member
I was going to say: can someone show me FreeSync support on PC for Nvidia? Because what I've read shows Nvidia has done everything possible to block it in favor of G-Sync.
 

Kambing

Member
HDMI 2.1 will be a thing for 2018. Most displays will take advantage of the increased bandwidth by offering 4K@120Hz. There will also most certainly be new HDMI 2.1-capable receivers. Even if the spec supports it, I've got a feeling only 1-2 companies will support VRR (my guess is Vizio with their 2018 P series, and LG with 2018 OLEDs). AMD GPUs are probably going to be the only thing that supports VRR through HDMI in 2018.

Support for VRR will increase in 2020 when the PS5 launches, which will probably tout it as a feature of next gen; I can definitely see Sony's TV division following suit. The major catalyst for VRR will probably come from the home release of the Avatar sequels in 2021, seeing as the movie will employ VRR in theaters. Heck, Cameron spurred 3D in the home; I can see the same being the case for VRR.

I hope I am wrong and VRR takes off sooner than later though!
 