
Television Displays and Technology Thread: This is a fantasy based on OLED

Theonik

Member
There is no HDMI 2.1 specification yet, and when there is it almost certainly won't change cable requirements. Cables certified as "High Speed" HDMI are all that's required, but it is still possible to get a bad cable even if it passes certification, so a trusted source is worthwhile.
High Speed HDMI cables that were sufficient to carry an HDMI 1.4 signal are not all sufficient for 4K60 signals as transmitted over HDMI 2.x. Higher frequencies are more susceptible to crosstalk and outside interference, so the manufacturing standard needs to be higher.
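For a rough sense of the jump in cable bandwidth, here's a back-of-the-envelope sketch (my own numbers, assuming standard CEA-861 blanking and 8-bit RGB; HDMI uses 8b/10b TMDS coding, so the wire rate is 1.25x the pixel data rate):

def tmds_gbps(h_total, v_total, refresh_hz, bits_per_pixel=24):
    # total pixel clock (active + blanking) x bits per pixel, with 8b/10b TMDS overhead
    pixel_clock = h_total * v_total * refresh_hz
    return pixel_clock * bits_per_pixel * 10 / 8 / 1e9

print(f"1080p60: {tmds_gbps(2200, 1125, 60):.1f} Gbit/s")  # ~4.5 Gbit/s, comfortable for older High Speed cables
print(f"4K60   : {tmds_gbps(4400, 2250, 60):.1f} Gbit/s")  # ~17.8 Gbit/s, right at the HDMI 2.0 ceiling

Roughly four times the signalling rate over the same four twisted pairs is why a marginal cable that was fine at 1080p can fall over at 4K60.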

e:
Price killed plasma. OLED has got to get price down too. You can't even buy an OLED at Costco because of price.

If plasma had stayed around, IR would have been fully resolved, if the 2008 Kuros are any indication. I've still yet to experience IR on my calibrated Kuro, but I did on my 2007 Kuro and I've experienced it on a friend's Panasonics, so it's not like I'm blind to IR.

Price>>>Weight>>IR
I don't think OLED yields are progressing fast enough to compete with LCD in the foreseeable future. When you can get a $400 55" LCD, what can OLED do?
 

LeleSocho

Banned
The agenda of some people to talk about burn-in or decay rate on OLEDs is astounding. I don't really get it.
Agenda? Oh my god...
OLED has real and legitimate issues, but the vast majority of people continue to recommend these expensive panels anyway, with the risk that the buyer ends up frustrated with those problems because someone on a forum said that they "are not a thing anymore"

It's OK if, for you personally, the IQ of OLED outweighs its problems, but there's a sea of difference between that and "modern OLEDs are perfect, the problems were solved XX years ago"

Price killed plasma. OLED has got to get price down too. You can't even buy an OLED at Costco because of price.

If plasma had stayed around, IR would have been fully resolved, if the 2008 Kuros are any indication. I've still yet to experience IR on my calibrated Kuro, but I did on my 2007 Kuro and I've experienced it on a friend's Panasonics, so it's not like I'm blind to IR.

Price>>>Weight>>IR
My 2007 Kuro is still going strong with no IR/burn-in, let me tell you it will be a sad sad day when it'll start to noticeably wear down and present the issues.
 

III-V

Member
What's the deal with HDMI 2.1? I don't trust this round of TVs due to not being able to do 4k/60 4:4:4 with HDR

HDMI 2.1 will likely only add support for dynamic metadata. From what I have read, I don't think anyone is expecting 4K/60 4:4:4 with HDR.
 

Kyoufu

Member
Agenda? Oh my god...
OLED has real and legitimate issues, but the vast majority of people continue to recommend these expensive panels anyway, with the risk that the buyer ends up frustrated with those problems because someone on a forum said that they "are not a thing anymore"

Not quite. OLED does have issues, just like every other display technology on the planet, but what's funny as an OLED owner is seeing people go on and on about image retention, which LG has taken steps to make a non-issue, when they should really be concerned about screen uniformity. You think OLED buyers are exchanging their sets because they had IR/burn-in? No, they're exchanging them because vertical banding and vignetting are the real issues, and guess what? That is another area OLED has improved on since last year, but it still creates a panel-lottery experience for some buyers, and if I were to buy another OLED today I would be pretty nervous about what I'd end up with.

Thankfully, I ended up with a good panel on my first try, but I'd be fooling you if I said you'd have the same experience as me. Chances are you might not. Whereas I've used the TV for thousands of hours across my consoles and PC and I've never once thought about IR, because LG's compensation cycles, screensavers and screen dimming all do the thinking for me.
 
If you have a plasma display, that makes sense. Those things were power hogs.
However, OLEDs, being emissive displays, are not especially efficient themselves either.
That's part of the reason they have such an aggressive brightness limiter.
The most efficient type of display is an edge-lit LCD, if your concern is power consumption over anything else.

That said, it is almost never worthwhile to replace working hardware with something else for the sake of efficiency or "the environment".
It takes far more resources to manufacture and ship a new display than it does to stick with what you have got and keep using it until it dies.
A more efficient product rarely pays for itself over the useful lifetime of the product.

People should be replacing dead hardware with more efficient options, not buying new hardware to replace perfectly functional but less efficient hardware.
In most cases anyway, I'm sure there will always be exceptions.

Is that actually what your bill says, or an estimate based on the power supply ratings? Because saving ~$45/month would be at the extreme side of things.
Very unusual to be saving that much when replacing like-for-like.

I see! Thanks for explaining. Is the brightness limiter the reason why it took all these years for us to get OLED? I remember hearing that keeping each individual diode from dying prematurely was a massive limitation, and that this was why they could only make small displays for a long time?


Well, replacing the TV was the only change I made in October, so a few weeks ago when I saw the energy bill from November, I was surprised it was so much lower than usual. My new monitor is an LG as well - a 34-inch ultrawide. I had been using my 60-inch plasma as a monitor for almost a year. I couldn't afford to get a new monitor when the old one broke. It could also be that my energy company charges less in winter. They've done that in the past. Hmmm, that actually sounds plausible: energy consumption goes up in winter, so some energy companies balance out the expected rate to give a lower cost in winter.




Still, if your plan is to replace that plasma with a $4000 65" OLED, it would take ~7.5 years to pay for itself, which is probably longer than most people intend to keep their displays.
And that doesn't factor in its own running costs either, which means it's going to take even longer to pay for itself.

Of course there are other reasons to want to upgrade from a plasma to a new TV, but like I said, it rarely ever saves you money to switch to something more efficient when you work out the full cost vs how long you expect the useful lifespan of the product will be.
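As a quick sanity check on that figure, using the ~$45/month saving from above (which is already on the optimistic side):

tv_price = 4000        # USD for a 65" OLED
monthly_saving = 45    # USD/month drop in the power bill (the best-case figure above)

months = tv_price / monthly_saving
print(f"{months:.0f} months, about {months / 12:.1f} years to break even")
# 89 months, about 7.4 years - and that's before the new set's own electricity use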



A power supply rating is its maximum load, not its power draw.
The system will only draw what it needs.
Power supplies are at their most efficient around 50% load, so if it's drawing 500W (which is very high for a gaming PC these days unless you're using SLI) then your system is probably at peak efficiency under load, which is ideal.
With a 1000W power supply you are probably not going to be in the most efficient range when idle, but again it is almost never worth the cost of replacing it for that.
At idle loads the difference in efficiency is only a few watts, not hundreds, and it would take decades for a power supply to pay for itself.
As long as the PSU is 80PLUS rated, it's not something you have to think about. Higher wattage power supplies are usually the most efficient.
Considering that a power supply is something you should be replacing every 5-10 years anyway (always replace them once the warranty expires) that will never happen.
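To put rough numbers on that (all of these are assumptions of mine, not measurements): say a ~5W idle efficiency gap, 8 hours a day at idle, $0.12/kWh, and a ~$100 replacement unit:

idle_delta_watts = 5      # assumed idle efficiency gap between old and new PSU
idle_hours_per_day = 8    # assumed time spent at idle
price_per_kwh = 0.12      # assumed electricity rate in USD
new_psu_cost = 100        # assumed price of the replacement unit

kwh_per_year = idle_delta_watts * idle_hours_per_day * 365 / 1000
usd_per_year = kwh_per_year * price_per_kwh
print(f"~${usd_per_year:.2f}/year saved, ~{new_psu_cost / usd_per_year:.0f} years to break even")
# ~$1.75/year saved, ~57 years to break even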

My PC is almost 7 years old. When I built it, it had 2 GeForce GTX 470s in SLI. These were super overclocked cards based on the Nvidia Fermi architecture. They were loud and they got hot. They were the worst power/thermal performers in Nvidia's history, and it was after this generation that they went all-in on power efficiency!

I think the big change that shows how far we've come is that the 970 I have in it now is so cool, so silent, and it draws so little power. And the CPU, an i7 4790, also draws a lot less power than the old i7 950. So in these 7 years I just stuck with the same Corsair 1000W power supply.



Burn-in/image retention is all about cumulative wear on emissive displays, not continuous use.
Continuous use will just cause it to happen sooner.

When I got my 60-inch LG PK950 plasma in 2010, it was at the dawn of the first consumer 3D TVs. Avatar had come out and the new models were going to have 3D, so I thought I would buy this top-of-the-line LG plasma; it was more affordable in the wake of the 3D TV releases. So I go to AVSForum, the guys who are supposedly the leading authority on TVs (?), and they tell me, and hundreds of others, that image retention is barely a problem: all you have to do is go through the break-in cycle for the first 100 hours.
The problem is that with a TV like this, the issues don't show up in the beginning, and shipping back a product like this (or a dishwasher, or a fridge) is just going to be extremely expensive.
I've been happy with the performance of my TV. The colors and contrast were amazing. Compared to the faded, pastel-tinted LCD displays of 2010, this one was a winner. But fuck me, has it been annoying and stressful paying 4000 USD for a product that needed babying, that had this energy consumption, and that made this clicking noise because it was overheating (a common problem on these).



I want a big-ass screen, but I want reliability. If some of the manufacturers would run the TVs through a 100-200 hour break-in phase before shipping, that would make me feel more assured. The other thing: if you're playing games or using it as a monitor, retention is going to be a problem.
My image retention is permanent. It might not be burn-in, but those logos never went away. It's very, very faint, but it still bums me out.
 

Weevilone

Member
Agenda? Oh my god...
OLED has real and legitimate issues, but the vast majority of people continue to recommend these expensive panels anyway, with the risk that the buyer ends up frustrated with those problems because someone on a forum said that they "are not a thing anymore"

It's OK if, for you personally, the IQ of OLED outweighs its problems, but there's a sea of difference between that and "modern OLEDs are perfect, the problems were solved XX years ago"

It starts looking like an agenda when certain posters mention burn-in nearly every time they post in the thread.

And I haven't seen anyone pitch OLED as perfect. A perfect display has never existed, not even close.
 
It starts looking like an agenda when certain posters mention burn-in nearly every time they post in the thread.

And I haven't seen anyone pitch OLED as perfect. A perfect display has never existed, not even close.

I have hundreds of posts in the old TV thread. I defy you to find all of those hundreds of posts in the old thread and prove they are all about burn-in.

This is a thread about TVs on a gaming forum. For gamers, IR and burn-in are much bigger problems than for people who watch movies, or even ESPN with the logo on the screen the whole time outside of commercials. It's not unreasonable to care about burn-in if you're a gamer, unless you enjoy having game HUDs permanently on your TV, which is what happened to my VT60.
 

Kyoufu

Member
I have hundreds of posts in the old TV thread. I defy you to find all of those hundreds of posts in the old thread and prove they are all about burn-in.

This is a thread about TVs on a gaming forum. For gamers, IR and burn-in are much bigger problems than for people who watch movies, or even ESPN with the logo on the screen the whole time outside of commercials. It's not unreasonable to care about burn-in if you're a gamer, unless you enjoy having game HUDs permanently on your TV, which is what happened to my VT60.

Good point. I don't play video games.

A good article on QLED, which could possibly be the future of Ultra HD TVs and beyond: https://www.cnet.com/news/how-qled-tv-could-help-samsung-finally-beat-lgs-oleds/

I hope Samsung do something with this, and Sony bring CLEDIS to consumers.
 

Paragon

Member
I see! Thanks for explaining. Is the brightness limiter the reason why it took all these years for us to get OLED? I remember hearing that keeping each individual diode from dying prematurely was a massive limitation, and that this was why they could only make small displays for a long time?
No, the brightness limiter has been a factor in all emissive displays.
What I meant by that comment is that they are still having to limit the brightness of the panel significantly when large areas of it are illuminated.
That's why the peak brightness of the current OLEDs is about 800 nits when only 10% of the screen is lit, but when you display a full white screen it drops to 150 nits.
It would actually be quite interesting if someone measured the power consumption, but I would expect that it's similar for both scenarios.
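As a crude model (my own back-of-the-envelope, not a measurement): if panel power scales roughly with lit area times luminance, the two cases land within a factor of two of each other, i.e. the limiter is essentially capping total light output. OLEDs also get less efficient per nit at high drive levels, which would pull the two cases closer still.

def relative_output(lit_fraction, nits):
    # crude proxy for panel power: lit area x luminance
    return lit_fraction * nits

print(relative_output(0.10, 800))  # 10% window at 800 nits -> 80.0
print(relative_output(1.00, 150))  # full-field white at 150 nits -> 150.0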

LCDs have typically not had any form of brightness limiter in place, in part due to their efficiency.
The new FALD HDR LCDs do now though, due to the higher brightness that HDR is pushing.
Samsung's KS9000 LCD has a peak brightness of 1500 nits and that drops to about 500 nits for full-screen white.

Generally the more efficient a display gets, the less they have to limit the brightness.
You could always just go brighter and consume more power, but I think that's starting to be quite highly regulated in many places now so nothing is likely to have anything like plasma power consumption again.

OLEDs are certainly going to be more efficient than plasmas, but perhaps not as efficient as LCDs yet.
Rtings lists average power consumption for the 55" E6 OLED as 96W and maximum as 146W, while the Samsung 55KS9000 is 53W/142W.

I don't know whether it's based on size or not, but after checking a few reviews it looks like 55" TVs, at least, are limited to a maximum of around 150W by whatever regulations are in place now (Rtings seems to prefer reviewing 55" models, for example).
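Using those Rtings averages, the running-cost gap is also small in absolute terms; assuming, say, 5 hours of viewing a day and $0.12/kWh (my numbers, not theirs):

oled_avg_w, lcd_avg_w = 96, 53   # Rtings averages quoted above (E6 vs 55KS9000)
hours_per_day = 5                # assumed viewing time
price_per_kwh = 0.12             # assumed electricity rate in USD

delta_kwh = (oled_avg_w - lcd_avg_w) * hours_per_day * 365 / 1000
print(f"~{delta_kwh:.0f} kWh/year, about ${delta_kwh * price_per_kwh:.0f}/year")
# ~78 kWh/year, about $9/year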

Well, replacing the TV was the only change I made in October, so a few weeks ago when I saw the energy bill from November, I was surprised it was so much lower than usual. My new monitor is an LG as well - a 34-inch ultrawide. I had been using my 60-inch plasma as a monitor for almost a year.

Ah well, it's not exactly like-for-like when going to something 73% smaller, and a monitor will be edge-lit, which is very efficient.
The difference would not have been so dramatic if you had replaced it with a 65" display, whether that's OLED, FALD LCD, or Edge-lit LCD.
So that $45/month reduction in your power bill is not quite so surprising then - especially if you were using the plasma as a PC monitor.
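For reference, that 73% figure is just the screen area worked out from the diagonals and aspect ratios (assuming a 16:9 plasma and a 21:9 ultrawide):

from math import hypot

def screen_area_sq_in(diagonal, aspect_w, aspect_h):
    scale = diagonal / hypot(aspect_w, aspect_h)   # inches per aspect unit
    return (aspect_w * scale) * (aspect_h * scale)

plasma = screen_area_sq_in(60, 16, 9)    # ~1538 sq in
monitor = screen_area_sq_in(34, 21, 9)   # ~419 sq in
print(f"{(1 - monitor / plasma) * 100:.0f}% smaller")  # ~73% smaller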

My PC is almost 7 years old. When I built it, it had 2 GeForce GTX 470s in SLI. These were super overclocked cards based on the Nvidia Fermi architecture. They were loud and they got hot. They were the worst power/thermal performers in Nvidia's history, and it was after this generation that they went all-in on power efficiency!
I think the big change that shows how far we've come is that the 970 I have in it now is so cool, so silent, and it draws so little power. And the CPU, an i7 4790, also draws a lot less power than the old i7 950. So in these 7 years I just stuck with the same Corsair 1000W power supply.
You might want to think about replacing it. Depending on the model, a 7 year warranty was common for the higher-end Corsair power supplies back then.
The power supply is something that I would always replace once the warranty period is up, because you don't want it to blow and potentially take out other components with it.

When I got my 60-inch LG PK950 plasma in 2010, it was at the dawn of the first consumer 3D TVs. Avatar had come out and the new models were going to have 3D, so I thought I would buy this top-of-the-line LG plasma; it was more affordable in the wake of the 3D TV releases. So I go to AVSForum, the guys who are supposedly the leading authority on TVs (?), and they tell me, and hundreds of others, that image retention is barely a problem: all you have to do is go through the break-in cycle for the first 100 hours.
The problem is that with a TV like this, the issues don't show up in the beginning, and shipping back a product like this (or a dishwasher, or a fridge) is just going to be extremely expensive.
I've been happy with the performance of my TV. The colors and contrast were amazing. Compared to the faded, pastel-tinted LCD displays of 2010, this one was a winner. But fuck me, has it been annoying and stressful paying 4000 USD for a product that needed babying, that had this energy consumption, and that made this clicking noise because it was overheating (a common problem on these).

I want a big-ass screen, but I want reliability. If some of the manufacturers would run the TVs through a 100-200 hour break-in phase before shipping, that would make me feel more assured. The other thing: if you're playing games or using it as a monitor, retention is going to be a problem.
My image retention is permanent. It might not be burn-in, but those logos never went away. It's very, very faint, but it still bums me out.
By most accounts that I've seen from owners, image retention is very transient on these OLEDs, and the worst of it is usually fixed by leaving the TV on standby overnight to run a compensation cycle.
However it does happen - even on the 2016 models, and especially with HDR gaming since that pushes the brightness much higher while there are static elements on-screen.
Outside of commercial usage, I've not heard of an LG OLED suffering from permanent burn-in or long-term image retention.
So while they may not be as prone to it as the plasmas that many owners also claimed never got image retention/burn-in, I would still be hesitant to buy one based on my intended use, which would mainly be as a PC monitor for the desktop and games, for many hours a day.
I'm not saying that anyone else shouldn't buy one, but they should make an informed decision and people shouldn't be posting here saying that image retention can't happen - just as the plasma evangelists were doing 5 years ago.
I'd still rather have a display where I don't have to even think about it, where I'm not going to be so tired one night that I fall asleep in front of the TV and possibly wake up to a display with long-term or even permanent damage.

I'm just really excited to see what LG, Samsung, Panasonic, and Sony are going to have on display at CES.
They're each going to have something different on display and that's very exciting.
Sony's OLED is probably going to be the least exciting from a "tech" perspective since they'll be using an LG panel, but it may end up being the display that I actually buy if they get the image processing and features right.

The main sticking point with OLED for me is still the motion handling.
While I would prefer not to have the worry of image retention/burn-in, if someone fixes the motion handling or adds G-Sync support (since the two are incompatible) that's probably enough for me to buy one.
 

LeleSocho

Banned
I remember when Sony were the pioneers of OLED god knows how many years ago, with that Sony fanboy Kojima even putting a Sony OLED screen on the Metal Gear Mk.2 in MGS4, and now they just buy panels from LG for their TV lineup lol

It starts looking like an agenda when certain posters mention burn-in nearly every time they post in the thread.

And I haven't seen anyone pitch OLED as perfect. A perfect display has never existed, not even close.

I meant perfect in terms of reliability, not perfect all around.

As for the "agenda" thing, I don't know what else to say other than that it's asinine, in that there's absolutely no benefit to anyone in people falsely complaining about OLEDs...
 
In general, people who complain about "agendas" are the ones who actually have one. Normal people just want to buy the best product for their own individual needs. Discussion of any product rightly includes discussion of benefits AND flaws.
 

Lady Gaia

Member
LCDs have typically not had any form of brightness limiter in place, in part due to their efficiency.
The new FALD HDR LCDs do now though, due to the higher brightness that HDR is pushing.
Samsung's KS9000 LCD has a peak brightness of 1500 nits and that drops to about 500 nits for full-screen white.

The KS9000 isn't FALD, though I suspect the same principle applies to the edge-lit 8000/9000, which can do local dimming in vertical stripes. Samsung's FALD set in this lineup is the KS9800 (I think that's the KS9500 in Europe?) and I'm not sure what the limits are there. I ignored that set since there's no flat counterpart.
 

BumRush

Member
In general, people who complain about "agendas" are the ones who actually have one. Normal people just want to buy the best product for their own individual needs. Discussion of any product rightly includes discussion of benefits AND flaws.

I do agree with you. But there is no evidence supporting some of the claims in this thread that you need to baby OLEDs or they will get severe burn-in.

It goes both ways. The flaws absolutely should be pointed out... and OLEDs certainly have flaws, but unsubstantiated posts about 2016 OLEDs and burn-in aren't the way to do it (not saying you particularly, but any posts regarding this)
 

LeleSocho

Banned
The KS9000 isn't FALD, though I suspect the same principle applies to the edge-lit 8000/9000, which can do local dimming in vertical stripes. Samsung's FALD set in this lineup is the KS9800 (I think that's the KS9500 in Europe?) and I'm not sure what the limits are there. I ignored that set since there's no flat counterpart.

I still don't understand why Samsung hasn't bothered making a flat counterpart of the KS9500, since that's obviously their true flagship and not the KS9000... are people actually still buying curved TVs?
 

The Beard

Member
I still don't understand why Samsung hasn't bothered making a flat counterpart of the KS9500, since that's obviously their true flagship and not the KS9000... are people actually still buying curved TVs?

Yeah, I never got the excitement around those. Maybe it's because I actually care about Picture Quality and don't give a shit about gimmicks?

I've recently seen two curved TVs in people's homes (I think they bought them around the time they first came out) and they're terrible. They pick up glares from all angles. The one lady had a lamp by her chair and I could see the lamp's reflection/glare from 180°. So terrible.
 

Paragon

Member
The KS9000 isn't FALD, though I suspect the same principle applies to the edge-lit 8000/9000, which can do local dimming in vertical stripes. Samsung's FALD set in this lineup is the KS9800 (I think that's the KS9500 in Europe?) and I'm not sure what the limits are there. I ignored that set since there's no flat counterpart.
Oh I thought the 9500/9000 and 8500/8000 were just the curved/flat versions of each model.
I hadn't really looked into it in much depth to be honest as I was not planning on buying one.
Rtings says that both are edge-lit local dimming rather than full array.
I think I just assumed that all the high peak brightness HDR LCDs were using FALD backlighting, while all the others at <500 nits were the edge-lit sets.
The point still stands though: it's only now that LCDs are trying to push beyond 1000 nits, and having to meet certain power requirements to do so, that they are also implementing brightness limiters; they're just set much higher because their designs are more efficient.
 

Lady Gaia

Member
Oh I thought the 9500/9000 and 8500/8000 were just the curved/flat versions of each model.

It seems to vary by region. In the US the scheme is as you described with the 9800 being the odd one out, both curved and full array local dimming. In Europe the corresponding display is designated KS9500, and the equivalent of the US KS8000 is a 7000 series display. Go figure.
 

Weevilone

Member
I do agree with you. But there is no evidence supporting some of the claims in this thread that you need to baby OLEDs or they will get severe burn-in.

It goes both ways. The flaws absolutely should be pointed out... and OLEDs certainly have flaws, but unsubstantiated posts about 2016 OLEDs and burn-in aren't the way to do it (not saying you particularly, but any posts regarding this)

That's where I'm at with it. Let's talk about the actual flaws if we're going to do it. I worried about burn-in with my RPCRT. I worried about it again with my plasma. In both cases it was just needless. I never had any problem and I just used them, didn't baby them.

The prevalent logic now seems to be that "plasma people" said don't worry, but some users had a bad experience with Panasonic units after that. The first thing to remember is that people got confident because Pioneer nailed it. And since Panasonic absorbed their intellectual property, if memory serves, there was a legitimate reason to believe that would carry through to the Panasonic models. Apparently it didn't.

Second thing to consider is that we're talking about a different technology and different companies altogether. I'm not sure why some are so quick to try drawing parallels between them.

If you're scared about it, then stay away. What's happening in the thread is that people who actually own and use them daily aren't experiencing issues. I'm not sure why that doesn't count for something. Regardless, I won't post about it further. It doesn't really matter.
 

holygeesus

Banned
In general, people who complain about "agendas" are the ones who actually have one. Normal people just want to buy the best product for their own individual needs. Discussion of any product rightly includes discussion of benefits AND flaws.

Well quite, but you have people in here, mentioning no names, who have no hands-on experience with OLED, ignore current improvements in the tech, and continue to spread false information that was true perhaps ten years ago...

Back on the topic of IR: the only time I have noticed it when gaming is playing The Witness in HDR. As soon as you solve a puzzle and continue, the outline of the box remains on-screen for a few seconds. It's not a case of HDR always = IR, as I have no repeat experience with TLOU, but it is there. I do have my brightness and contrast maxed out for HDR mode though, which won't help matters.
 

LeleSocho

Banned
Well quite, but you have people in here, mentioning no names, who have no hands-on experience with OLED, ignore current improvements in the tech, and continue to spread false information that was true perhaps ten years ago...

Yours is the epitome of the post that I was talking about earlier.
"modern OLEDs are perfectly fine, problems were solved 10 years ago"
Every year is the year they definitively fixed the problems, and if it's not the current year, then wait for the next few months when they'll announce the new panels that surely won't have any issues.
 

holygeesus

Banned
Yours is the epitome of the post that I was talking about earlier.
"modern OLEDs are perfectly fine, problems were solved 10 years ago"
Every year is the year they definitively fixed the problems, and if it's not the current year, then wait for the next few months when they'll announce the new panels that surely won't have any issues.

Yeah, you might want to point out where I said that all the flaws were ironed out ten years ago... especially when quoting a post of mine that highlights one of the current flaws of OLED; otherwise you might make yourself look silly.
 
I have a Vizio 4K M-series from 2015. I have it hooked up to my PC. I noticed something weird in the Nvidia control panel the other day, hoping someone can enlighten me. So, when I go into the Nvidia panel to set resolution and refresh rate, I can set 1080p 60Hz and 3840x2160 60Hz. However, for any resolution in between, say 1440p, I can only set a max of 30Hz. I would think that if the set has the bandwidth for 4K 60Hz, it should be able to do 1440p 60Hz, no? Unless it needs some sort of frame buffer to upscale the image and 1440p is too big? I've tried forcing games into 1440p but I can never set the refresh higher than 30Hz. Anyone know what's going on here?
 
I have a Vizio 4K M-series from 2015. I have it hooked up to my PC. I noticed something weird in the Nvidia control panel the other day, hoping someone can enlighten me. So, when I go into the Nvidia panel to set resolution and refresh rate, I can set 1080p 60Hz and 3840x2160 60Hz. However, for any resolution in between, say 1440p, I can only set a max of 30Hz. I would think that if the set has the bandwidth for 4K 60Hz, it should be able to do 1440p 60Hz, no? Unless it needs some sort of frame buffer to upscale the image and 1440p is too big? I've tried forcing games into 1440p but I can never set the refresh higher than 30Hz. Anyone know what's going on here?

Sounds like an EDID issue, meaning 1080p and 4K are the only modes officially reported over HDMI(?). If you create a custom resolution, that should allow you to do 1440p60.
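For what it's worth, it shouldn't be a bandwidth problem: with approximate timings (CVT-RB for 1440p, CEA-861 for 4K), 1440p60 needs a much lower pixel clock than the 4K60 mode the set already accepts, so the 30Hz cap is almost certainly just a missing entry in the EDID mode list. A quick sketch with my own rough numbers:

def pixel_clock_mhz(h_total, v_total, refresh_hz):
    return h_total * v_total * refresh_hz / 1e6

print(f"2560x1440@60 (CVT-RB): {pixel_clock_mhz(2720, 1481, 60):.0f} MHz")  # ~242 MHz
print(f"3840x2160@60 (CEA)   : {pixel_clock_mhz(4400, 2250, 60):.0f} MHz")  # 594 MHz

So a custom 1440p60 mode should be comfortably within the TV's limits.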
 
I have a Vizio 4K M-series from 2015. I have it hooked up to my PC. I noticed something weird in the Nvidia control panel the other day, hoping someone can enlighten me. So, when I go into the Nvidia panel to set resolution and refresh rate, I can set 1080p 60Hz and 3840x2160 60Hz. However, for any resolution in between, say 1440p, I can only set a max of 30Hz. I would think that if the set has the bandwidth for 4K 60Hz, it should be able to do 1440p 60Hz, no? Unless it needs some sort of frame buffer to upscale the image and 1440p is too big? I've tried forcing games into 1440p but I can never set the refresh higher than 30Hz. Anyone know what's going on here?

IIRC, only one of the HDMI ports on that TV supports 60Hz. I want to say it's port 5.
 

j-wood

Member
I have a question about the KS8000 (I recently bought the 60in).

Is game mode necessary for gaming? I just purchased a new receiver (Yamaha TSR-5810), and now that I have my PS4 Pro and X1S going to it, I'll have to manually turn game mode on and off every time I play a game, since it's all on one input on the TV.
 

spannicus

Member
I have a question about the KS8000 (I recently bought the 60in).

Is game mode necessary for gaming? I just purchased a new receiver (Yamaha TSR-5810), and now that I have my PS4 Pro and X1S going to it, I'll have to manually turn game mode on and off every time I play a game, since it's all on one input on the TV.
I haven't turned on game mode since I bought the TV. I play Titanfall 2 almost daily, no problem.
 

The Beard

Member
I have a question about the KS8000 (I recently bought the 60in).

Is game mode necessary for gaming? I just purchased a new receiver (Yamaha TSR-5810), and now that I have my PS4 Pro and X1S going to it, I'll have to manually turn game mode on and off every time I play a game, since it's all on one input on the TV.

I haven't turned on game mode since I bought the TV. I play Titanfall 2 almost daily, no problem.

Input lag when not in Game Mode was measured at 113ms on the KS8000.
 
I have a question about the KS8000 (I recently bought the 60in).

Is game mode necessary for gaming? I just purchased a new receiver (Yamaha TSR-5810), and now that I have my PS4 Pro and X1S going to it, I'll have to manually turn game mode on and off every time I play a game, since it's all on one input on the TV.
That's the exact reason I bought a receiver with dual outputs. I have one HDMI going to a TV input calibrated for movies, and another going to an input calibrated for gaming.

Not all combinations work though; I had an older Denon receiver going to my 1080p Sony W900A and that had HDCP handshake issues when I tried dual outputs. My new Denon X2300W and my LG E6 work together perfectly though.
 

dragos495

Member
I bought a 65" LG C6 on Black Friday and I absolutely love it! I use it as a PC monitor and you would think I'd worry about IR, but I don't! That said, I did run that manual compensation cycle just about every time I turned off the TV, the one in OLED Panel Settings > Clear Panel Noise > start once TV is turned off. The TV is on for about 10-12 hours every day, with browsing, Netflix, and gaming on a PS4 taking most of its on time.

And just now I learned that wasn't very smart, and that the TV actually does that on its own for every 4 hours of running time. What can I say, RTFM wasn't on my list... :(

Anyway, after all that, on some worst-case-scenario images and movie clips with all kinds of grayscales and black percentages, I can only notice some minor banding and some vignetting. I don't know if it was worse before all those cycles or if I made it worse (doubt it), but now I know to just leave it alone.

One thing that bothers me is that I don't get all that input info in the corner when I change inputs, I only get HDMI 1/2/3. Even if I click on it with the pink pointer, the info only stays for a few seconds and then goes away every time I change inputs.

Oh, and the ABL, but I can live with that.
 
I have an OLED E6 and I can tell you it is beautiful. What pisses me off is there is no content to play on it. 4K gaming struggles to hit 60fps so it looks like shit anyway, and there are NO 4K drives on the market. I believe there are some digital players that cost $200, but most digital stuff, if not all, is just upscaled. There is just no content.

Still love the TV though. Upscaled content looks awesome, and the colors and black levels make it worth it alone.
 
I have an OLED E6 and I can tell you it is beautiful. What pisses me off is there is no content to play on it. 4K gaming struggles to hit 60fps so it looks like shit anyway, and there are NO 4K drives on the market. I believe there are some digital players that cost $200, but most digital stuff, if not all, is just upscaled. There is just no content.
Unless you own a 4K UHD player. :p
 

Kyoufu

Member
I have an OLED E6 and I can tell you it is beautiful. What pisses me off is there is no content to play on it. 4K gaming struggles to hit 60fps so it looks like shit anyway, and there are NO 4K drives on the market. I believe there are some digital players that cost $200, but most digital stuff, if not all, is just upscaled. There is just no content.

I'm confused :(
 

BumRush

Member
I have an OLED E6 and I can tell you it is beautiful. What pisses me off is there is no content to play on it. 4K gaming struggles to hit 60fps so it looks like shit anyway, and there are NO 4K drives on the market. I believe there are some digital players that cost $200, but most digital stuff, if not all, is just upscaled. There is just no content.

Still love the TV though. Upscaled content looks awesome, and the colors and black levels make it worth it alone.

Whatyearisit.jpg
 

Paragon

Member
Yup, same result. The custom res test doesn't output a signal to the TV.
Are you sure that you're creating the custom resolution correctly?
I'm only hooked up to a 1080p TV right now, but the process is to:
  1. Click "Create Custom Resolution"
  2. This should default to your current resolution; change the timing from Automatic to Manual to lock the timings in place.
  3. Set your custom resolution in the "Display Mode" section. In this example, the TV receives a 1080p signal but the PC is rendering 960x540.
 

MazeHaze

Banned
I have a question about the KS8000 (I recently bought the 60in).

Is game mode necessary for gaming? I just purchased a new receiver (Yamaha TSR-5810), and now that I have my PS4 Pro and X1S going to it, I'll have to manually turn game mode on and off every time I play a game, since it's all on one input on the TV.

People saying they don't use game mode: I have no idea how anyone can deal with over 100ms of input lag. I accidentally had it out of game mode once when playing Alien Isolation and I noticed immediately. I could never play a game like that, even one that isn't reflex-based. There is a noticeable delay between pressing the right stick and the camera moving.

Seriously if you aren't using game mode, give it a try, I think you'll really appreciate the difference. Over 100 ms is INSANE to me.
 

Zoe

Member
So after seeing a friend's 65" 4K LG OLED, I have to say I'm a true believer. One thing was bothering me though: Showtime looked like shit on his TV. The premium channels look better than standard cable on my puny 1080p 49", so I dread to think what they must look like on his TV.

Is this going to be a problem with all 4K TVs?
 