
Television Displays and Technology Thread: This is a fantasy based on OLED

There won't be anything like an 'endgame' display for at least the next 8 years.

There won't ever be an 'endgame' display that you keep for 20 years anymore.

But HDMI 2.1 is going to be a big deal in the next 2-3 years. The thing is, every big new HDMI release doesn't just mean buying a new TV; it also means buying a new receiver and new source devices like game consoles and video cards. This is how they keep us spending money, which is why it's in the best interests of the consumer electronics industry to never make an 'endgame' product.
 

GeoNeo

I disagree.
HDMI 2.1 is a big shift in the industry. Adding support for VRR and having enough bandwidth for 4K @ 120Hz 4:4:4 is huge, because most of these panels have a native refresh rate of 120Hz but have until now been limited to 60Hz at native resolution by bandwidth constraints.
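Rough math on why the bandwidth matters (my own back-of-the-envelope sketch; real HDMI links also carry blanking and encoding overhead, so the exact figures are loose):

def data_rate_gbps(width, height, hz, bits_per_pixel):
    # raw video payload only, ignoring blanking and link-encoding overhead
    return width * height * hz * bits_per_pixel / 1e9

# 4:4:4 chroma at 10-bit is 30 bits per pixel
print(data_rate_gbps(3840, 2160, 60, 30))   # ~14.9 Gbps -- fits within HDMI 2.0's 18 Gbps
print(data_rate_gbps(3840, 2160, 120, 30))  # ~29.9 Gbps -- needs HDMI 2.1's 48 Gbps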

Anyone who has experienced a high refresh rate display knows it's very hard to go back to 60Hz.

These TVs, with high refresh rate support, VRR, and Dynamic HDR / Dolby Vision, will truly be a sight to behold.

The other great thing is that in recent years a lot of the top manufacturers are paying attention to keeping input lag low in game mode. It was so annoying in the past when manufacturers gave no fucks. I can honestly see input lag on next year's sets clocking in at under 16.67ms (1 frame @ 60Hz). I have a low input lag monitor and I have no issue switching between these new UHD displays, with input lag in the low 20ms range, and my "tournament" monitor for fighting games. I credit reviewers paying more attention to input lag, along with Leo Bodnar creating an easy device to test it, for this shift in manufacturers' attitude.
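For reference, one frame of lag is just 1000 ms divided by the refresh rate (quick arithmetic of my own):

for hz in (60, 120):
    print(hz, "Hz:", round(1000 / hz, 2), "ms per frame")  # 16.67 ms at 60 Hz, 8.33 ms at 120 Hz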
 
A TV bought in 2018 could very well have HDMI 2.1, but even then it won't take full advantage of the connection's potential. And a TV bought today will be able to display video to its own full potential using HDMI 2.0. In other words, lack of HDMI 2.1 is no reason to put off buying a new TV.

So while HDMI 2.1 is all about the future, it's still worth learning about today -- if only to warn you against buying overpriced HDMI cables labeled as "2.1." Here's a look at what the new spec is, and more importantly, what it isn't.
https://www.cnet.com/news/hdmi-2-1-what-you-need-to-know/


Soo... no reason to wait.
 
I'll still wait. I'm not overly fussed about the majority of 2.1, but VRR / Scorpio on the horizon has changed things slightly. Hopefully by the time Scorpio comes out we might hear which current sets (if any) will support it. I'd hate to buy now and find that even though bandwidth wasn't an issue, the TV I bought wasn't going to be compatible with such a feature.
 

Theonik

Member
HDMI 2.1 is a big shift in the industry. Adding support for VRR and having enough bandwidth for 4K @ 120Hz 4:4:4 is huge, because most of these panels have a native refresh rate of 120Hz but have until now been limited to 60Hz at native resolution by bandwidth constraints.
The wait for cards that can run games at 4K 120fps will be palpable. If you are willing to sacrifice res though many 2017 sets will accept 1080p 120 fps already.

I'll still wait. I'm not overly fussed about the majority of 2.1, but VRR / Scorpio on the horizon has changed things slightly. Hopefully by the time Scorpio comes out we might hear which current sets (if any) will support it. I'd hate to buy now and find that even though bandwidth wasn't an issue, the TV I bought wasn't going to be compatible with such a feature.
VRR isn't a core part of the spec, in the sense that a TV supporting 2.1 doesn't mean it can accept VRR input, and in fact the implementation details of VRR are manufacturer-dependent, such that there will be good VRR displays and shit ones. That's why AMD made Freesync.
 

GeoNeo

I disagree.
The wait for cards that can run games at 4K 120fps will be palpable. If you are willing to sacrifice res though many 2017 sets will accept 1080p 120 fps already.

SLI Volta Ti is what I plan to go with. :D Along with top of the range set (about 10K budget) that supports 4K @ 120Hz and good implementation of VRR.
 
VRR isn't a core part of the spec, in the sense that a TV supporting 2.1 doesn't mean it can accept VRR input, and in fact the implementation details of VRR are manufacturer-dependent, such that there will be good VRR displays and shit ones. That's why AMD made Freesync.

True, but you'd expect a flagship TV from a major manufacturer that supports the full 2.1 spec to accept VRR. It'd be weirder if they didn't in my opinion.
 

Theonik

Member
True, but you'd expect a flagship TV from a major manufacturer that supports the full 2.1 spec to accept VRR. It'd be weirder if they didn't in my opinion.
VRR is harder than people realise to implement correctly. Especially in an HDMI context where audio sync is involved.

SLI Volta Ti is what I plan to go with. :D Along with top of the range set (about 10K budget) that supports 4K @ 120Hz and good implementation of VRR.
SLI causes micro stutter though... I'm probably going dual Volta Titan but still.
 

Paragon

Member
The other great thing is that in recent years a lot of the top manufacturers are paying attention to keeping input lag low in game mode. It was so annoying in the past when manufacturers gave no fucks. I can honestly see input lag on next year's sets clocking in at under 16.67ms (1 frame @ 60Hz). I have a low input lag monitor and I have no issue switching between these new UHD displays, with input lag in the low 20ms range, and my "tournament" monitor for fighting games. I credit reviewers paying more attention to input lag, along with Leo Bodnar creating an easy device to test it, for this shift in manufacturers' attitude.
The thing that a lot of people seem to forget is that VRR means that you can eliminate V-Sync lag too.
So it's removing latency from the game side of things too, not just the display side of things.
A VRR display with 30ms latency has lower input lag than a fixed refresh display with 20ms latency using V-Sync.
But ideally input lag is always going to be as low as possible. G-Sync monitors are typically <5ms.
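To put toy numbers on that (my own illustration, not measured data): with V-Sync a finished frame sits in the buffer until the next refresh tick, which costs half a refresh interval on average, and double buffering typically adds another full frame on top.

refresh_ms = 1000 / 60            # 16.67 ms per refresh tick at 60 Hz
avg_sync_wait = refresh_ms / 2    # ~8.3 ms average wait for the next tick
buffer_delay = refresh_ms         # one extra buffered frame is typical with V-Sync

fixed_plus_vsync = 20 + avg_sync_wait + buffer_delay  # ~45 ms end to end
vrr_display = 30                                      # scans out as soon as the frame is ready
print(round(fixed_plus_vsync, 1), vrr_display)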

SLI Volta Ti is what I plan to go with. :D Along with top of the range set (about 10K budget) that supports 4K @ 120Hz and good implementation of VRR.
Kind of defeats the purpose of using a variable refresh rate display when you're going to have 'microstutter' and an extra frame of lag with a multi-GPU setup.

VRR is harder than people realise to implement correctly. Especially in an HDMI context where audio sync is involved.
You say that like G-Sync displays don't already support audio.

OLED eliminates a lot of the issues with VRR too.
OLED should not have problems with very low refresh rates so they should theoretically support native 24-120Hz or even lower than that.
No need to worry about variable overdrive with OLED's <1ms response times.
 

GeoNeo

I disagree.
SLI causes micro stutter though... I'm probably going dual Volta Titan but still.

I know, but it's gotten a ton better. On my current SLI 1080 setup it's very minimal to nonexistent. I did notice that Windows 10 does a much better job of it with these 10-series cards than Windows 7, which would stutter like crazy in games. All tested on a 1440p 120Hz display.

Kind of defeats the purpose of using a variable refresh rate display when you're going to have 'microstutter' and an extra frame of lag with a multi-GPU setup.

Unless somehow Volta is crazy fast and I don't need SLI, I'll have to live with it. Though I do play a lot of older games that these newer cards can push to crazy high frame rates while still rendering at high resolution. If a Volta card can average around 90fps at 4K, I'd be happy to stick with a single card next year. Wait and see, I guess.

Edit: Just remembered NV would also have to support VRR on these new TVs with Volta. I wonder if they are bone-headed enough not to.
 

tokkun

Member
Edit: Just remembered NV would also have to support VRR on these new TVs with Volta. I wonder if they are bone-headed enough not to.

My expectation is that they won't support it for a couple years. The market of people who have HDMI 2.1 VRR capable TVs connected to PCs is going to be a small niche for a while. They probably make more money on G-sync than they would lose even if that entire group bought AMD instead. And we have already seen them make this exact calculus with FreeSync.
 

Theonik

Member
You say that like G-Sync displays don't already support audio.

OLED eliminates a lot of the issues with VRR too.
OLED should not have problems with very low refresh rates so they should theoretically support native 24-120Hz or even lower than that.
No need to worry about variable overdrive with OLED's <1ms response times.
Different protocols. Also, that nVidia solved it doesn't mean that every manufacturer will bother. It's a hard problem, and every manufacturer will have to come up with their own solution to it. The appeal of G-Sync was exactly that they didn't have to.
 

KevinG

Member
I bought Planet Earth on Blu-ray when it came out, to show off the quality of the format from my PS3.

Never actually finished watching through the series.

Over a decade later and I want to buy Planet Earth II so badly, even though I'll probably never get around to watching through it.
 

Yukstin

Member
I got the Sony UBP-X800 along with Planet Earth 2 last night.

One, the Sony player solves the issues I had with the Xbox S: it's whisper-quiet in its operation, and my receiver remote works with the Sony player, so I don't have to use the cumbersome Xbox controller to control movies. I haven't played enough content to notice a picture quality difference between the devices, but overall I like what Sony has done so far.

Second, HOLY CRAP, Planet Earth 2 on my C6 is amazing. The PQ is so much better than any 4K streaming source.
 
I don't get it.

If a 2018 TV has 2.1... then it will be ready to take advantage of the spec as soon as the other components in the home theater realm are able. Yeah, that may not be next year, but when it does happen, your TV will be ready.

How is this not a reason to wait on spending thousands on a mid to high end set?

Part of the problem is that 2.1 as a spec is still being finalized. Only then will we have any idea what manufacturer support looks like.

If you read the write-up on the HDMI website, the cable needs to be re-engineered to support 48Gbps, but the connector is unchanged from 2.0.

That means, if manufacturers had the desire, they could potentially update 2.0 devices to support some or all of the 2.1 features.

Now, that's unlikely to be best for business, and there may be HW limitations as well. I'm not sure that adding VRR is as simple as a firmware update.

Point is, it isn't outside the realm of possibility and I don't think we know enough yet.
 
There won't ever be an 'endgame' display that you keep for 20 years anymore.

But HDMI 2.1 is going to be a big deal in the next 2-3 years. The thing is, every big new HDMI release doesn't just mean buying a new TV; it also means buying a new receiver and new source devices like game consoles and video cards. This is how they keep us spending money, which is why it's in the best interests of the consumer electronics industry to never make an 'endgame' product.

This is what I think will ultimately push me to next year. I wanted a new TV + new receiver. I'd rather not buy both new when HDMI 2.1 is right around the corner.
 

vpance

Member
Sony A1 vs Samsung KS9590

Some interesting notes
-A1 can reach over 400W power draw! Crazy. Wonder if that goes for LG too
-Clear Motion BFI setting on Low: 47ms input lag; on High: 55ms
-Backlight 12 on the Sammy roughly equals the A1's light level in HDR

Probably some other tidbits in there but I don't know German.
 

holygeesus

Banned
Sony A1 vs Samsung KS9590

Some interesting notes
-A1 can reach over 400W power draw! Crazy. Wonder if that goes for LG too
-Clear Motion BFI setting on Low: 47ms input lag; on High: 55ms
-Backlight 12 on the Sammy roughly equals the A1's light level in HDR

Probably some other tidbits in there but I don't know German.

That power draw is exactly why these sets are never going to blind you with brightness. They have tamed the ABL, but at a cost to your energy bill, it seems.
 

vpance

Member
That power draw is exactly why these sets are never going to blind you with brightness. They have tamed the ABL, but at a cost to your energy bill, it seems.

Yeah, it's just a fact of life for OLED. Imagine the wattage if they didn't give a damn about regulations or image retention. Next stop QLED, I guess.

A 1000W OLED at 1500 nits. Would you jump in? You'd probably have to move it to the basement and crank up the AC.
 

vivftp

Member
2018 is gonna be really exciting. I'm personally going to be going with a new 2018 OLED + Nvidia Volta PC setup. Actually, I would not be shocked if Sony showed off a consumer CLEDIS, which I'd gladly pay a fuck ton for lol.

I'd LOVE to see a consumer CLEDIS display, but I fear we may be a ways off from that. The current CLEDIS panels seem to be extremely expensive, draw a ton of power and have a fair bit of cooling on them. Given how tiny the actual LEDs are though, I don't see why they couldn't pack them closer, so long as heat buildup isn't a huge problem.

Almost makes me wonder how Sony managed to build that 1080p version all those years ago.
 

Thorrgal

Member
I do. I used to use the TV speakers, but I used the AV ones for a while and when I switched back the TV speakers were horrible even for just simple TV shows. So now I always use the receiver. It doesn't take any extra work - with HDMI-CEC, when I turn my TV on the receiver turns on too, and the TV remote volume buttons control the receiver.

Same here
 

mrklaw

MrArseFace
Damn guys, all these 2017 models looking like world beaters is making me sad, haven't even had my B6 a week :(

Bought a new TV? Carefully researched which one to get? Paid a good price for it?

Great. Now run the fuck away from threads like this for a couple of years at least.
 

tokkun

Member
I don't get it.

If a 2018 TV has 2.1... then it will be ready to take advantage of the spec as soon as the other components in the home theater realm are able. Yeah, that may not be next year, but when it does happen, your TV will be ready.

How is this not a reason to wait on spending thousands on a mid to high end set?

HDMI 2.1 includes support for 8K/60. How many TVs released in 2018 do you think will be "ready" to display 8K? Probably none of them. Parts of the spec will be treated as optional, and we don't know whether they will be supported by first-generation devices.
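The raw numbers back that up (my own rough arithmetic, not from the spec text): uncompressed 8K/60 at 10-bit 4:4:4 exceeds even the new 48 Gbps link, which is why the spec pairs those modes with Display Stream Compression.

# uncompressed 8K/60 at 30 bits per pixel (10-bit 4:4:4), before blanking overhead
print(7680 * 4320 * 60 * 30 / 1e9)  # ~59.7 Gbps -- more than HDMI 2.1's 48 Gbps carries raw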
 

e90Mark

Member
Bought a new TV? Carefully researched which one to get? Paid a good price for it?

Great. Now run the fuck away from threads like this for a couple of years at least.

Solid advice.
The only thing not giving me buyer's remorse is that my C6 has 3D and I use it frequently, haha.
 

Zhao_Yun

Member
Bought a new TV? Carefully researched which one to get? Paid a good price for it?

Great. Now run the fuck away from threads like this for a couple of years at least.

Haven't even bought a new TV yet and I already know that I will just do that after buying one.
 

DRB

Neo Member
Why is game mode greyed out (LG B6)? Is there a default setting somewhere that should be turned off? I've tried switching HDMI inputs and assigning them as game consoles, but that doesn't seem to have done anything.
 
Why is game mode greyed out (LG B6)? Is there a default setting somewhere that should be turned off? I've tried switching HDMI inputs and assigning them as game consoles, but that doesn't seem to have done anything.

Did you connect an audio device via Bluetooth? I recall reading that this disables game mode for some reason.
 
Damn guys, all these 2017 models looking like world beaters is making me sad, haven't even had my B6 a week :(

What did you pay for it? I think the B6 is the right choice at this time. Honestly don't think the 2017 sets are worth it based on price/performance over the B6. Especially the A1E. Hella happy with mine. I got it relatively cheap and it is an amazing set. Best set I've owned, period. Since it didn't break the bank I'll just upgrade in 2019 when LG adds HDMI 2.1. Move the B6 to another room or pass it off to the fam.
 

Theonik

Member
Yeah, it's just a fact of life for OLED. Imagine the wattage if they didn't give a damn about regulations or image retention. Next stop QLED, I guess.

A 1000W OLED at 1500 nits. Would you jump in? You'd probably have to move it to the basement and crank up the AC.
I'd jump. My PC eats more than that.
 

Madness

Member

Not bad, the image improvements and motion are great, but brightness being lower than the C7 as well as costing much more means LG is still the OLED to get in 2017. Good first effort by Sony, though. I don't know why 1080p input lag is listed as such a large negative. Honestly, if you are buying an A1E and gaming at 1080p, I don't understand. Maybe for PC'ers? Otherwise, 29-30ms at 4K and HDR is pretty good for a first attempt. DCI coverage is not as high as I would like. Here's hoping they surprise us with a Z9E or something later.
 
Not bad, the image improvements and motion are great, but brightness being lower than the C7 as well as costing much more means LG is still the OLED to get in 2017. Good first effort by Sony, though. I don't know why 1080p input lag is listed as such a large negative. Honestly, if you are buying an A1E and gaming at 1080p, I don't understand. Maybe for PC'ers? Otherwise, 29-30ms at 4K and HDR is pretty good for a first attempt. DCI coverage is not as high as I would like. Here's hoping they surprise us with a Z9E or something later.

Well, for me the Switch can't output in 4K, and neither can any older console you hook up to it. Sounds like Sony wins on motion, but the aggressive ABL they use makes the screen much darker than it otherwise could be.
 

vpance

Member
Well, for me the Switch can't output in 4K, and neither can any older console you hook up to it. Sounds like Sony wins on motion, but the aggressive ABL they use makes the screen much darker than it otherwise could be.

From the review they say if you turn off extended DR and calibrate to 130 nits you get no ABL.

130 is pretty dim though. But basically it becomes the 4K plasma that many have dreamed of, lol.
 
Does the B7/C7 even have BFI? That's a nice thing to have. I'm guessing the A1E loses a fair bit of brightness without being able to make it up if you use it with HDR...
 