Television Displays and Technology Thread: This is a fantasy based on OLED

I appreciate these burn-in tests, but I'm not the slightest bit concerned. I'm on my 3rd OLED TV, mainly used for games/PC, and I haven't had a single issue.

The tech is solid as far as I'm concerned. It's the best image anyone could ever get. Simply astounding.
 
Will ask again: I have a question about HDMI. If I set the input to enhanced format, and all sources (1080i, 1080p, 4K, HDR, etc.) come from the AVR, is there any downside to this?

If not, why aren't all HDMI inputs set to enhanced by default?


For those of you with an AVR: do you now have more cables connected to the TV? A separate one for 4K media?
 
Although it sounds like you have made your decision, I recommend you read my post from a few days ago in this thread. I did a full rundown of the differences, as I have owned both.

And in a nutshell, the 930 has the 900 beat in almost every way. And in my opinion worth the price hike.

I'll take a look, thanks!
 

holygeesus

Banned
A stress test is a means of replicating long term usage - in the short term - so that you can give people reasonable answers on what might be concerns to look out for before they actually become concerns.

It isn't though, at least not in this scenario, as you are stressing the technology in a way that wouldn't be replicated by the normal hours of usage a person would generally put on their set. We already know the lifespan of OLED technology, so this seems to be a test that recreates artificial situations that won't occur in real-world use.
 
Just posting my post from the LG thread that shows how to fix HDR Game Mode by changing the input icon (a fix from a poster at the AVS forum). It's useful for 2017 OLEDs that can't downgrade the firmware and may have shipped with the version that has the dim HDR Game Mode.
http://m.neogaf.com/showpost.php?p=250566824

Fuck LG.


Unfortunately HDR does not work properly in PC mode so that is not a viable fix. The colours do not display correctly and the options needed to correct it are greyed out.


Also talked about in the forums at avs.
 

The Lamp

Member
Unfortunately HDR does not work properly in PC mode so that is not a viable fix. The colours do not display correctly and the options needed to correct it are greyed out.


Also talked about in the forums at avs.

On none of the modes?? Even adjusting the basic color sliders doesn't help? :(
 
I haven't seen any gradient banding in HDR PC mode on my PC. Is it just a PS4 thing? Also, I saw somebody post that if you change the output to 4:2:0 on the PS4 Pro, the banding issue goes away.

Unfortunately HDR does not work properly in PC mode so that is not a viable fix. The colours do not display correctly and the options needed to correct it are greyed out.


Also talked about in the forums at avs.

On none of the modes?? Even adjusting the basic color sliders doesn't help? :(
Apparent solution on previous page, guys

If that's your issue, anyway
 

BumRush

Member
I'd rather they spend their time investigating why some people are getting apparent burn-in under normal use (such as yourself, right?). Stress tests just seem like a waste of energy, literally.

After reading some posts here and reading the Rtings test in greater detail, I believe the true intent of it is just to replicate long-term usage in a few months' time. It's not perfect, but it's effectively the only way to test newer sets, since they just came out.
 

tokkun

Member
OK, it's about as much use to a consumer as those drop tests for phones. Nobody uses a TV like they do in their test, unless they are shut-ins who sleep for 2 hours a day and just watch the same looping video over and over, so what is the point?

This type of testing is very common within industry. It's called a HALT test: https://en.wikipedia.org/wiki/Highly_accelerated_life_test

I would be very surprised if LG does not already do something like this internally, because it is fairly standard practice in electronics to test products in this way.

Now, it is certainly possible that such tests can be non-representative if they capture a failure mode that could not occur normally. But if the cause of burn-in is uneven wear on the pixels, then it really shouldn't matter whether you get to 500 hours in 4 weeks or in 4 years, right?

The results we are seeing in the Rtings test are being validated against the anecdotal evidence given by non-"shut-ins" who got burn-in through normal use. Based on the real-world experience of those users, people over in the AVSForum thread were successfully predicting in advance the time that we would start seeing clearly visible burn-in in the Rtings test. I find that to be a fairly compelling piece of evidence in favor of the representative nature of the Rtings test.
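To put rough numbers on that, here is a back-of-the-envelope sketch only; the 22-hour schedule is the near-continuous looping described in this thread, while the 5-hour "typical day" and the 500-hour target are assumptions for illustration, not figures published by Rtings or LG:

```python
# Sketch: how an accelerated schedule compresses calendar time.
# All numbers are illustrative assumptions, not published specs.

test_hours_per_day = 22      # near-continuous looping, as described in the thread
typical_hours_per_day = 5    # assumed ordinary household viewing

# One day on the stress schedule covers this many days of typical viewing.
acceleration = test_hours_per_day / typical_hours_per_day
print(f"Acceleration factor: ~{acceleration:.1f}x")

# Calendar time needed to accumulate the same 500 on-screen hours.
target_hours = 500
print(f"Stress schedule: ~{target_hours / test_hours_per_day:.0f} days")
print(f"Typical viewing: ~{target_hours / typical_hours_per_day:.0f} days")
```

Either way the panel sees the same cumulative hours of wear; the stress schedule just gets there in weeks instead of months.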
 
Need a bit of advice.

I got the TCL P605 recently, but had to do an exchange (the screen wouldn't turn on). I received the replacement set yesterday and it was fine at first, but then a few hours in, it developed a thin horizontal line of dead pixels across the middle. I'm going to return this set, and will probably choose a different TV.

So here's my question, should I bump up my budget and get the Sony X900 or are there other comparable TVs in the $600-$800 price range? I currently have vanilla PS4 and Xbox One, but plan to upgrade to Pro and the One X at the end of this year so the TV will see a lot of gaming. There's also the size issue where my wife thinks 55" is too large and the X900 has a 49" available.
 

holygeesus

Banned
After reading some posts here and reading the Rtings test in greater detail, I believe the true intent of it is just to replicate long-term usage in a few months' time. It's not perfect, but it's effectively the only way to test newer sets, since they just came out.

What do you consider long-term usage though? OLED as a technology has been used at a consumer level for years and years now, and it is acknowledged that pixel deterioration is real and is being improved upon generation to generation, with blue pixel deterioration being the latest problem to be solved.

My point is, why the need to brute-force the same image over and over again? Nobody uses their set in this way over its lifespan. If a set were being run for 22 hours with random, mixed content, I could see some validity to it, but you can't legitimately advance testing using the method Rtings are utilising, as it doesn't equate to how a set is treated in the home and is therefore pointless. We know that pixels deteriorate, and at a different rate if you show the same image constantly, so this isn't news.

When it starts happening during regular use, as *is* happening as these sets become more commonplace, then sure, investigate why, but the Rtings test tells us nothing we don't know already about the technology, i.e. that pixel deterioration is real and will never be eliminated, just as plasma panels have an end-of-life and just as LCD backlights have a lifespan before they go pop.
 

PerkeyMan

Member
So, is there any "consensus" about waiting for HDMI 2.1 or not if you are planning on buying a new TV for the long term? My Pioneer Kuro has served me well but it's time to move on :')
 

Kyoufu

Member
So, is there any "consensus" about waiting for HDMI 2.1 or not if you are planning on buying a new TV for the long term? My Pioneer Kuro has served me well but it's time to move on :')

Yeah, I wouldn't buy an HDMI 2.0 TV when we're a handful of months away from CES 2018.
 

Guerrilla

Member
Looks like the 75" X900E and X940E are my two best options; does anybody know what these are called in Europe?

edit: nvm these seem to be quite a bit more expensive here...

Does anybody have any experience with the PHILIPS 75PUS7101/12?
 

Mrbob

Member
So is it confirmed that the HDR banding is fixed on 2017 OLEDs with 4:2:0 on the PC input?

I might update then. PC mode is great for SDR gaming; I just don't use it for HDR because of the banding.

Sitting on 3.5.xx firmware, but I need some sort of confirmation.
 

spwolf

Member
Need a bit of advice.

I got the TCL P605 recently, but had to do an exchange (the screen wouldn't turn on). I received the replacement set yesterday and it was fine at first, but then a few hours in, it developed a thin horizontal line of dead pixels across the middle. I'm going to return this set, and will probably choose a different TV.

So here's my question, should I bump up my budget and get the Sony X900 or are there other comparable TVs in the $600-$800 price range? I currently have vanilla PS4 and Xbox One, but plan to upgrade to Pro and the One X at the end of this year so the TV will see a lot of gaming. There's also the size issue where my wife thinks 55" is too large and the X900 has a 49" available.

The X900E is a good TV and 55" can't be too big... Always get the bigger one; you get used to the size quickly, and otherwise you will regret the smaller TV purchase.
 

Mrbob

Member
I can second that. Never go down in size unless you just don't have the space. Your eyes will adjust quickly to the bigger size.
 

Mrbob

Member
I just retested HDR with the PC input on Ratchet and Clank, on a PS4 with firmware 3.5.xx. In standard HDR there are seriously washed-out colors in PC mode. It looks so good in standard HDR on a regular HDMI input, but the input lag is horrible.
 

Weevilone

Member
Ok, thanx! I'll wait :)

I'm wondering if we'll see TV manufacturers go the way of some AVR manufacturers and start advertising the individual supported features rather than proclaim support for the whole HDMI standard. I bet we don't see all of 2.1 rolled into one model year, but across several.
 

BumRush

Member
What do you consider long-term usage though? OLED as a technology has been used at a consumer level for years and years now, and it is acknowledged that pixel deterioration is real and is being improved upon generation to generation, with blue pixel deterioration being the latest problem to be solved.

My point is, why the need to brute-force the same image over and over again? Nobody uses their set in this way over its lifespan. If a set were being run for 22 hours with random, mixed content, I could see some validity to it, but you can't legitimately advance testing using the method Rtings are utilising, as it doesn't equate to how a set is treated in the home and is therefore pointless. We know that pixels deteriorate, and at a different rate if you show the same image constantly, so this isn't news.

When it starts happening during regular use, as *is* happening as these sets become more commonplace, then sure, investigate why, but the Rtings test tells us nothing we don't know already about the technology, i.e. that pixel deterioration is real and will never be eliminated, just as plasma panels have an end-of-life and just as LCD backlights have a lifespan before they go pop.


From RTINGS:

"A 5.5 hour video loop is used as the test pattern. It has been designed to mix static content with moving images to represent some typical content. The base material is a recording of over the air antenna TV with RTINGS overlay logos of different opacities and durations, and letterbox black bars added. These additional elements are:

-Top and bottom: Letterbox bars present for 2 hours, then absent for 3.5 hours (movie example)
-Top left: 100% solid logo, present for the whole clip (torture test)
-Top right: 50% opacity logo, present for the whole clip (network logo torture test)
-Bottom left: 100% solid logo, present for 2 hours then absent for 3.5 hours (video games example)
-Bottom right: 50% opacity logo, present for 10 minutes then absent for 2 minutes (sports or TV shows example)"

With the exception of the top left (and maybe the top right?), the rest of the test is somewhat close to a feasible use case. Heavy usage? Yes. But a family of 4-5+ might have the TV on 10+ hours a day, with similar content on most of the time.
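For a rough sense of what that loop implies per day, here is a minimal sketch using only the duty cycles quoted above; the 10-hour viewing day is just the heavy-household example from this post, not anything Rtings specifies:

```python
# Sketch: static-content exposure per region implied by the quoted RTINGS loop,
# scaled to a hypothetical 10-hour viewing day (the family-of-4-5+ example).

VIEWING_HOURS_PER_DAY = 10  # assumed heavy-household usage

# Fraction of the 5.5-hour loop during which each static element is on screen.
duty_cycle = {
    "letterbox bars (movie example)":       2.0 / 5.5,
    "top-left 100% logo (torture test)":    5.5 / 5.5,
    "top-right 50% logo (network logo)":    5.5 / 5.5,
    "bottom-left 100% logo (video games)":  2.0 / 5.5,
    "bottom-right 50% logo (sports/TV)":    10.0 / 12.0,  # 10 min on, 2 min off
}

for region, duty in duty_cycle.items():
    print(f"{region}: ~{duty * VIEWING_HOURS_PER_DAY:.1f} h of static exposure per day")
```

So outside the two always-on torture logos, the static elements land in a range a heavy household could plausibly hit with letterboxed movies, games, and sports.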
 

tokkun

Member
What do you consider long-term usage though? OLED as a technology has been used at a consumer level for years and years now, and it is acknowledged that pixel deterioration is real and is being improved upon generation to generation, with blue pixel deterioration being the latest problem to be solved.

My point is, why the need to brute-force the same image over and over again? Nobody uses their set in this way over its lifespan. If a set were being run for 22 hours with random, mixed content, I could see some validity to it, but you can't legitimately advance testing using the method Rtings are utilising, as it doesn't equate to how a set is treated in the home and is therefore pointless. We know that pixels deteriorate, and at a different rate if you show the same image constantly, so this isn't news.

A couple of months ago, if you claimed that you got burn-in on an OLED, you were called a liar or a Samsung shill. People may have known that pixel degradation was possible, but in no way was there any sort of widespread acceptance that it could actually cause burn-in. There were many people going around saying that burn-in was impossible due to pixel shifting and compensation cycles. People said this to me in this very thread not that long ago. So claiming that it is not news to show that you can get burn-in in a few hundred hours seems like some serious revisionist history to me.

Beyond proving the possibility of burn-in, there is a lot of value in getting a ballpark estimate on how many hours we can safely display a static portion of an image - be it a game HUD or the Breaking News ticker at the bottom of a cable channel - before we can expect to see visible burn-in. This can tell you whether your usage patterns are likely to be a problem.

When it starts happening during regular use, as *is* happening as these sets become more commonplace, then sure, investigate why, but the Rtings test tells us nothing we don't know already about the technology, i.e. that pixel deterioration is real and will never be eliminated, just as plasma panels have an end-of-life and just as LCD backlights have a lifespan before they go pop.

That is exactly what is happening. People were getting burn-in from normal use, provided you consider watching a couple of hours of cable news per day to be "normal". That is what triggered this investigation. And thanks to the Rtings test, we now understand it much better. If 500 hours of cumulative watch time with a static image on screen causes visible burn-in, then watching CNN or MSNBC for 2 hours a day for a year will give you burn-in.
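Spelling out that last ballpark (same caveat as before: the 500-hour figure is an assumed threshold used for the estimate, not a measured limit):

```python
# Sketch of the "2 hours of cable news a day" estimate from the post above.
assumed_threshold_hours = 500   # hypothetical cumulative static-image hours before visible burn-in
news_hours_per_day = 2          # static ticker/logo on screen each day

days_to_visible_burn_in = assumed_threshold_hours / news_hours_per_day
print(f"~{days_to_visible_burn_in:.0f} days, i.e. well within a year")   # ~250 days
```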
 
The X900E is a good TV and 55" can't be too big... Always get the bigger one; you get used to the size quickly, and otherwise you will regret the smaller TV purchase.

I can second that. Never go down in size unless you just don't have the space. Your eyes will adjust quickly to the bigger size.

Third.

I'm already adjusted to my 65'' coming from 55''. Sitting here like how did I use 55'' for so long.

Okay, thanks! Any thoughts on other TVs in the $600-$800 price range or is the X900 the best option?
 

Mrbob

Member
Yeah, I think those Rtings burn-in tests are fine. My only issue is that testing only the 2016 OLED doesn't really tell me how the 2017 model performs. That test is really only useful for 2016 LG OLED owners. I wish they had included a 2017 model in their testing, as I'd like to see whether the 2017 models show the same issue.
 
Okay, thanks! Any thoughts on other TVs in the $600-$800 price range or is the X900 the best option?

The TCL is still the best bang for the buck. Was there anything else that annoyed you about the TCL, or is it just your bad luck with it so far?

The X900 is a fine alternative, but I wouldn't buy it if I had the option to get the TCL.
 

BumRush

Member
Yeah, I think those Rtings burn-in tests are fine. My only issue is that testing only the 2016 OLED doesn't really tell me how the 2017 model performs. That test is really only useful for 2016 LG OLED owners. I wish they had included a 2017 model in their testing, as I'd like to see whether the 2017 models show the same issue.

Agreed.

They buy all of their own sets, though, and they test a TV for one year (normal review procedure). They mentioned that funds were limited and they couldn't afford to grab another C7 just to test for burn-in, which makes sense.
 

Fredrik

Member
I bought the Sony 65" xe9305/xe93 about a week ago, found it with a $500 price drop (yay!). And it's insanely bright, like blindingly bright, too bright to be honest! x.x
Picture quality is just superb though (when the brightness is turned down). I don't have any 4K consoles yet, but I've been using the new Apple TV 4K, and while it probably isn't as good as an OLED, I'm really pleased with how it looks so far.
I tried to calibrate it using some screen caps of the calibration on Rtings, and it became a lot better than the factory settings, which were too high on contrast and color imo. There is some clouding going on if the picture and room are pitch black, but I usually have a lamp on so it doesn't bother me much.
Overall I'm super satisfied! Highly recommended! :)
 
I bought the Sony 65" xe93 about a week ago, found it with a $500 price drop (yay!). And it's insanely bright, like blindingly bright, too bright to be honest! x.x
Picture quality is just superb though (when the brightness is turned down). I don't have any 4K consoles yet, but I've been using the new Apple TV 4K, and while it probably isn't as good as an OLED, I'm really pleased with how it looks so far.
I tried to calibrate it using some screen caps of the calibration on Rtings, and it became a lot better than the factory settings, which were too high on contrast and color imo. There is some clouding going on if the picture and room are pitch black, but I usually have a lamp on so it doesn't bother me much.
Overall I'm super satisfied! Highly recommended! :)
Congrats! If next year's sets turn out disappointing, I'm getting this one for sure.
 

FLEABttn

Banned
Ordered a C7, selling my C6. Now to really get on LG's nerves about that Game Mode.

To be honest, I find the HDR Game Mode on the C7 to be perfectly fine in terms of brightness.

That said, in SDR mode I keep the OLED light setting at 16 for a nice plasma-type glow, so maybe take my opinion with a grain of salt.
 

EvB

Member
I swear the Xbox One has had some update to handle HDR better; all my games look better.

Also BF1 is the best showcase I’ve seen for the technology
 
I bought the Sony 65" xe9305/xe93 about a week ago, found it with a $500 price drop (yay!). And it's insanely bright, like blindingly bright, too bright to be honest! x.x
Picture quality is just superb though (when the brightness is turned down). I don't have any 4K consoles yet, but I've been using the new Apple TV 4K, and while it probably isn't as good as an OLED, I'm really pleased with how it looks so far.
I tried to calibrate it using some screen caps of the calibration on Rtings, and it became a lot better than the factory settings, which were too high on contrast and color imo. There is some clouding going on if the picture and room are pitch black, but I usually have a lamp on so it doesn't bother me much.
Overall I'm super satisfied! Highly recommended! :)

I don't have mine yet, but I found this: https://www.avforums.com/threads/sony-bravia-kd-55xe9305-hdr-4k-tv-review-comments.2095999/page-4

With BFI you will need the brightness... this vid https://youtu.be/j6m_HrdCPks explains it too (but it's in German).

Maybe worth a try...
 

Alfredo_V

Neo Member
Hmm. Can I suggest that you connect your PS4 (Pro?) to the TV just to test it out? Try the following settings:

PS4 Video Settings

RGB Range: Limited
Everything else on Auto

TV Settings

Input Mode : Game Console (to open this menu, hold and press the button between Netflix and Amazon on remote)
Picture Mode : HDR Game
Color : 48
Color Temperature : W46
Dynamic Contrast : Low
Color Gamut : Extended
Black Level : Low

General -> HDMI DEEP ULTRA -> On

If this does not work I don't know mate

EDIT:

Actually, one more question: what is your viewing environment like? What is the ambient light like?

The reason I ask is that most mastering is done in a low-light/no-light environment. Try the above settings with no lights on/little ambient light. Unfortunately, unlike SDR, there is no way to artificially increase the brightness of HDR content (dynamic contrast, but yuck). OLEDs already struggle to replicate HDR content mastered beyond 1000 nits. That 1000 nits will look completely different in a dark room vs a well-lit room. Until there are HDR modes (like HLG) that take ambient light into account, it's best to mimic the viewing environment that the content was mastered in.

I did test out your suggested settings and found the results to be very similar. But I think you're on point regarding ambient light. I tend to game in quite a light room, and of course there's a difference when I game in a completely dark room, but only small differences.

I'm very certain my problem is due to firmware, like others also reported. Thanks for your suggestions though!
 

Kambing

Member
4K@120 is a given, I'd say. Why couldn't VRR be a thing before Navi? Freesync 2 is a shoo-in if they want it to be, right?

I expect LG to implement more burn-in countermeasures as well.

Man, I really hope VRR is a thing before Navi. I suppose I'm just pessimistic lol. In my mind, it's going to take a catalyst or a big launch to make the implementation widespread. I see AMD waiting until Navi so they can tout it as a feature at launch, as well as work with TV vendors on implementing the tech. For Sony, I see them implementing VRR in their TVs when the PS5 launches, to cross-promote. The X1X supports VRR through FreeSync, but currently that will only be on monitors unless they get the TV guys on board? I'd love to be wrong!

My hope is that Nvidia supports VRR through HDMI 2.1.
 