
Television Displays and Technology Thread: This is a fantasy based on OLED

ACH1LL3US

Member
Yup. I wouldn't advise messing around too much in there, though. There's a guide on AVS Forum if you look.


You can also get the service menu app on the Android store; I use it on all my LG OLEDs. I would only disable ASBL, which is under the OLED menu in the service menu. Nothing else should be messed with.

Interesting story: the 2017s have a new option under the OLED menu called APL, or average picture level. It is set to 48 but you can change it; from what I can tell it adjusts the ABL on the set.
 

holygeesus

Banned
Interesting story: the 2017s have a new option under the OLED menu called APL, or average picture level. It is set to 48 but you can change it; from what I can tell it adjusts the ABL on the set.

I'm not sure I'd be messing with ABL. Not that I have any idea whether it would damage the set, but I'd expect your energy bills to skyrocket.
 

NYR

Member
I bought one to disable the auto-dimming feature that kicks in when a static image is on screen for a set time, as I found it sometimes kicking in during dark movies when it shouldn't. I just bought one of those cheap all-in-one remotes from Argos. Worked great.

Huh. I love that feature. I haven't had it kick in during live content though, just when I pause.
 

Kyoufu

Member
why would you disable a feature intended to prevent image retention?

Someone earlier in this thread did that and then left his TV on a static image in the YouTube app and came back home to a nasty surprise. :3
 

holygeesus

Banned
why would you disable a feature intended to prevent image retention?

Someone earlier in this thread did that and then left his TV on a static image in the YouTube app and came back home to a nasty surprise. :3

I never leave a static image on and even if I do, I have every faith that it wouldn't lead to significant IR. If I don't get it with HUDs after hours of gaming, I won't get it when I nip to the loo and pause a movie.
 

gnexus

Member
Just pulled the trigger on a 65" E6 OLED. I wanted to purchase something soon, and the 2017 models were just too rich for my blood (even though I'm spending a pretty penny already). Plus, I still enjoy 3D movies from time to time. It'll be an upgrade from my VT50 plasma; it's been a few years since I've gotten a new TV. I'm pretty excited.
 

ACH1LL3US

Member
why would you disable a feature intended to prevent image retention?

Someone earlier in this thread did that and then left his TV on a static image in the YouTube app and came back home to a nasty surprise. :3


That was me... lol

I did it twice, but the good news is the B6 erased all the image retention, so no worries :)
 
I have a dumb question. Why do calibration guides for pretty much all TVs suggest warm colors? Most of the time I see less detail when all colors are washed out and whites are yellowish... I prefer something neutral. Is this just a preference thing or am I missing out?
 

NYR

Member
I have a dumb question. Why do calibration guides for pretty much all TVs suggest warm colors? Most of the time I see less detail when all colors are washed out and whites are yellowish... I prefer something neutral. Is this just a preference thing or am I missing out?
Not a dumb question. Warm is supposed to produce the most accurate color representation under ISF calibration methodology. You are supposed to set it to Warm2 and then calibrate to get the "right" level of colours and detail. The other common settings "blow out" colours, which looks attractive to the eye at first but may not be accurate (e.g. whites have a blue tint).

While I appreciate the reasoning behind the push to use the warm setting, I refuse to use it. I like cool settings; they just look better to me. I hate the piss filter when it comes to warm settings. I watch a lot of hockey, and the rink is not white no matter what I do, and it does not look like it does when I am at the game in person. My cool or medium setting makes it look good to me. Trust me, do what makes it look good for you; as long as your other settings are reasonable, you'll be fine. There will be many who will try to convince you otherwise, but do what looks good to you and don't feel bad for using cool.
 

GeoNeo

I disagree.
I have a dumb question. Why do calibration guides for pretty much all TVs suggest warm colors? Most of the time I see less detail when all colors are washed out and whites are yellowish... I prefer something neutral. Is this just a preference thing or am I missing out?

Blue tint to white is not neutral.

However, most "warm" (6500K) presets in TVs are way off and push too much red. The only way to properly calibrate is with a meter (a colorimeter or spectrophotometer).

It's not hard to do at all, and the best investment I ever made was buying an i1Display Pro; it worked amazingly on all my high-end displays (plasma & OLED).

Lastly, people who just copy settings they find on forums most of the time don't even understand that panels vary from each other greatly, so if anything they might end up with a less accurate final image.

Edit: A properly calibrated color temperature should never make whites look horrible. If anything, once you adjust and then go back to color temperatures that push blue, you'll find it very annoying.
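If you want to sanity-check what a meter (or a review) reports, the usual quick conversion from a measured CIE (x, y) white point to a correlated color temperature is McCamy's approximation. A rough sketch in Python; it's only reasonable for near-white readings close to the daylight locus, and real calibration software does far more:

Code:
# McCamy's approximation: CCT in kelvin from a CIE 1931 (x, y) reading.
# Only trustworthy for near-white points close to the daylight locus.
def cct_mccamy(x, y):
    n = (x - 0.3320) / (0.1858 - y)
    return 449.0 * n**3 + 3525.0 * n**2 + 6823.3 * n + 5520.33

print(round(cct_mccamy(0.3127, 0.3290)))  # D65 white point -> ~6505 K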
 

Madness

Member
I have a dumb question. Why do calibration guides for pretty much all TVs suggest warm colors? Most of the time I see less detail when all colors are washed out and whites are yellowish... I prefer something neutral. Is this just a preference thing or am I missing out?

It is not preference or missing out, but the universal standard of the D65 reference, aka 6500K white light (technically 6504K). You are seeing a color-accurate picture, not one overly skewed to one color (i.e. more red or more blue) or an artificially enhanced image. You are seeing it how filmmakers often shoot and how theatres often present films. Only recently, in the LCD, plasma, and HDTV era, has it become skewed, with artificial enhancements and vivid/dynamic modes becoming the norm. It is why when I have guests over I have to put on dynamic or sports mode, or they complain that football looks dull or the ice looks yellow during the NHL playoffs.
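The odd 6504K figure is just bookkeeping, by the way: the radiation constant used in the color temperature calculation was revised after D65 was standardized, which rescaled the nominal 6500K. A quick check of the arithmetic:

Code:
# D65 was defined as 6500K, but the second radiation constant c2 was
# later revised from 1.4380e-2 to 1.4388e-2 m*K, rescaling its CCT.
c2_old, c2_new = 1.4380e-2, 1.4388e-2
print(round(6500 * c2_new / c2_old, 1))  # ~6503.6 K, quoted as 6504K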

Again though, unless you have the tools to properly calibrate yourself, you may also end up with inaccurate images, as some colors are oversaturated and need correction. Panels also vary, as does ambient room light. It is why you can use someone else's settings but should always tweak to your preference.
 

holygeesus

Banned
I have a dumb question. Why do calibration guides for pretty much all TVs suggest warm colors? Most of the time I see less detail when all colors are washed out and whites are yellowish... I prefer something neutral. Is this just a preference thing or am I missing out?

I have my set calibrated two ways, one accurate and one I actually enjoy looking at, and I flip between the two. I can tell you now that I completely and utterly prefer the one calibrated using Warm1 instead of the more accurate Warm2. Whites look white, not blue, and skin tones are more accurate. I'd rather have an image I enjoy watching than one that someone else tells me is right.
 
So all the new TVs from the major companies have already been released this year, right? We can't expect new lines of TVs with HDMI 2.1 to be released this year?
 

scently

Member
So all the new TVs from the major companies have already been released this year, right? We can't expect new lines of TVs with HDMI 2.1 to be released this year?

Someone might, but generally I don't think so. If you are interested in VRR, it is possible that some TVs already out could be updated to include it, as it's not something that needs the bandwidth of 2.1 to function. It's something that can be added. Whether or not they do is a different matter altogether.
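For what it's worth, the reason VRR doesn't need extra bandwidth is that it works by holding the panel in vertical blanking longer between frames; the pixel clock never changes. A rough sketch using the standard 4K60 timing (the stretched blanking value is made up for illustration):

Code:
# VRR lowers the refresh rate by stretching vertical blanking; the pixel
# clock (and so the bandwidth) stays the same. CTA-861 4K60 timings.
pixel_clock = 594e6              # Hz
h_total, v_total = 4400, 2250    # active + blanking totals per frame
print(pixel_clock / (h_total * v_total))                 # 60.0 Hz
v_stretched = 2800               # illustrative longer vblank
print(round(pixel_clock / (h_total * v_stretched), 1))   # ~48.2 Hz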
 
Someone might, but generally I don't think so. If you are interested in VRR, it is possible that some TVs already out could be updated to include it, as it's not something that needs the bandwidth of 2.1 to function. It's something that can be added. Whether or not they do is a different matter altogether.

VRR, Dolby Vision, HLG, and especially HDR with dynamic metadata, which needs HDMI 2.1 bandwidth.
 

GeoNeo

I disagree.
The biggest HDMI 2.1 features I'm looking forward to are VRR + bandwidth support for 4K @ 120Hz with full 4:4:4 chroma. It will be killer on an OLED or even a high-end LCD. These high-end sets already have native 120Hz 4K panels, just held back by HDMI 2.0 bandwidth limitations.
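The back-of-the-envelope math on why 2.0 can't do it (active pixels only, assuming 10-bit, and ignoring blanking and link overhead, so the real requirement is even higher):

Code:
# Uncompressed data rate for 4K @ 120Hz, 10-bit 4:4:4, active pixels only.
w, h, fps, bpp = 3840, 2160, 120, 3 * 10   # 10 bits per channel, RGB/4:4:4
print(round(w * h * fps * bpp / 1e9, 1))   # ~29.9 Gbit/s
# HDMI 2.0 tops out at 18 Gbit/s (~14.4 Gbit/s payload); HDMI 2.1's
# 48 Gbit/s has room to spare.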

I don't know if Sony will release a new Z9 at the end of this year, but if they do I would not be shocked if it was the first high-end consumer UHD display to support HDMI 2.1.

2018 is gonna be really exciting. I'm personally going to go with a new 2018 OLED + Nvidia Volta PC setup. Actually, I would not be shocked if Sony showed off a consumer CLEDIS, which I'd gladly pay a fuck ton for lol.

Edit: Lastly, I wonder if Samsung will get off their ass and show off a consumer replacement for LCD, since they are getting left behind in the high-end space. We all know they have their own OLED display tech along with other self-emissive display technology.

I love where display tech is going. A few years back, when Pioneer and Panasonic pulled out of plasmas, it seemed we would be doomed to cheap edge-lit LCDs for a very long time; however, here we are with all the top manufacturers pretty much putting out killer display after killer display.
 
The biggest HDMI 2.1 features I'm looking forward to are VRR + bandwidth support for 4K @ 120Hz with full 4:4:4 chroma. It will be killer on an OLED or even a high-end LCD. These high-end sets already have native 120Hz 4K panels, just held back by HDMI 2.0 bandwidth limitations.

I don't know if Sony will release a new Z9 at the end of this year, but if they do I would not be shocked if it was the first high-end consumer UHD display to support HDMI 2.1.

2018 is gonna be really exciting. I'm personally going to go with a new 2018 OLED + Nvidia Volta PC setup. Actually, I would not be shocked if Sony showed off a consumer CLEDIS, which I'd gladly pay a fuck ton for lol.

I wonder if the PS4 Pro will support any of this, since it is HDMI 2.0. The Scorpio has the advantage of releasing late this year and adding hardware support for HDMI 2.1.
 

Theonik

Member
I don't think Sony is prepared to ship an HDMI 2.1 set this year, but they are well overdue a redesign of their board/processor setup. They were probably holding off for the spec changes before doing that, since it's a significant expenditure. Hopefully they will try to eliminate the port split at the same time, but we'll see.

e: As far as we know, Scorpio's VRR implementation isn't compatible with HDMI 2.1's, and we have no indication MS is going to add HDMI 2.1.
This isn't to say it can't change, but Scorpio's spec had to be finalised before HDMI 2.1's, and the changes that would be needed are substantial.
 

GeoNeo

I disagree.
I wonder if the PS4 Pro will support any of this, since it is HDMI 2.0. The Scorpio has the advantage of releasing late this year and adding hardware support for HDMI 2.1.

Well, HDMI 2.0 could support VRR, no doubt. When it comes to 4K @ 120Hz, it's technically possible if they were to use, say, 4:1:1 chroma, but I doubt we will see it. Even still, 4K 60Hz support with VRR would be a great thing on these mid-gen consoles.
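Same napkin math as the 4:4:4 case: chroma subsampling cuts the average samples per pixel, which is what would squeeze 4K120 toward HDMI 2.0 territory. Again active pixels only and assuming 10-bit, so treat these as lower bounds:

Code:
# Average samples per pixel: 4:4:4 = 3, 4:2:2 = 2, 4:2:0 / 4:1:1 = 1.5.
w, h, fps, depth = 3840, 2160, 120, 10
for fmt, spp in [("4:4:4", 3.0), ("4:2:2", 2.0), ("4:2:0", 1.5), ("4:1:1", 1.5)]:
    print(fmt, round(w * h * fps * spp * depth / 1e9, 1), "Gbit/s")
# 4:1:1 lands around 14.9 Gbit/s, right at the edge of HDMI 2.0's
# ~14.4 Gbit/s payload -- hence "technically possible" at best.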

I don't think Sony is prepared to ship an HDMI 2.1 set this year, but they are well overdue a redesign of their board/processor setup. They were probably holding off for the spec changes before doing that, since it's a significant expenditure. Hopefully they will try to eliminate the port split at the same time, but we'll see.

e: As far as we know, Scorpio's VRR implementation isn't compatible with HDMI 2.1's, and we have no indication MS is going to add HDMI 2.1.
This isn't to say it can't change, but Scorpio's spec had to be finalised before HDMI 2.1's, and the changes that would be needed are substantial.

Yeah, the safe bet is Sony releasing it next year. Hell, I even recall a Sony rep saying when the Z9 came out that it would be their flagship product for 24 months... which lines up perfectly with the replacement coming out in 2018.

As for VRR on Scorpio, HDMI 2.0 could easily support it; there are no bandwidth limitations when it comes to that. So I'm sure whatever HDMI spec they go with, these HDMI 2.1 TVs will have no issue supporting the VRR spec.

I doubt they would be dumb enough to make such claims without already doing pre-testing and making sure whatever implementation the HDMI Forum is going with is what their new console supports.
 

Theonik

Member
I doubt they would be dumb enough to make such claims without already doing pre-testing and making sure whatever implementation the HDMI Forum is going with is what their new console supports.
Microsoft never claimed to support HDMI 2.1 VRR. They are supporting AMD's FreeSync over HDMI, which is a different implementation that isn't supported by very many displays either.
 

GeoNeo

I disagree.
Microsoft never claimed to support HDMI 2.1 VRR. They are supporting AMD's FreeSync over HDMI, which is a different implementation that isn't supported by very many displays either.

Going by this Eurogamer article, Scorpio indeed supports HDMI 2.1 VRR too.

http://www.eurogamer.net/articles/digitalfoundry-2017-project-scorpio-supports-freesync-and-hdmi-vrr

Last week, we published the hardware spec for Microsoft's next Xbox - Project Scorpio. However, there was one little detail we held back, an aspect of the new console we didn't want to get lost in the noise. In the here and now its applications will be limited, but in the fullness of time, it may help to bring about a profound shift in how displays interface with games hardware. To cut a long story short, Scorpio supports AMD's FreeSync - and the upcoming variable refresh rate support baked into the next-gen HDMI 2.1 spec.

Edit: And the Digital Foundry Video: https://www.youtube.com/watch?v=t18QbBdPK-8
 

GeoNeo

I disagree.
The HDMI 2.1 bit is speculation on DF's part. AMD's current FreeSync over HDMI uses HDMI vendor extensions, not HDMI 2.1 VRR.

Hmm, that is kind of shitty of them to word the article like they did, since they were the ones that got hands-on time with Scorpio.

Oh well, not too long till E3, when we'll get more news from MS directly.
 
I have my set calibrated two ways, one accurate and one I actually enjoy looking at, and I flip between the two. I can tell you now that I completely and utterly prefer the one calibrated using Warm1 instead of the more accurate Warm2. Whites look white, not blue, and skin tones are more accurate. I'd rather have an image I enjoy watching than one that someone else tells me is right.

It is not preference or missing out, but the universal standard of the D65 reference, aka 6500K white light (technically 6504K). You are seeing a color-accurate picture, not one overly skewed to one color (i.e. more red or more blue) or an artificially enhanced image. You are seeing it how filmmakers often shoot and how theatres often present films. Only recently, in the LCD, plasma, and HDTV era, has it become skewed, with artificial enhancements and vivid/dynamic modes becoming the norm. It is why when I have guests over I have to put on dynamic or sports mode, or they complain that football looks dull or the ice looks yellow during the NHL playoffs.

Again though, unless you have the tools to properly calibrate yourself, you may also end up with inaccurate images, as some colors are oversaturated and need correction. Panels also vary, as does ambient room light. It is why you can use someone else's settings but should always tweak to your preference.

Blue tint to white is not neutral.

However, most "warm" (6500K) presets in TVs are way off and push too much red. The only way to properly calibrate is with a meter (a colorimeter or spectrophotometer).

It's not hard to do at all, and the best investment I ever made was buying an i1Display Pro; it worked amazingly on all my high-end displays (plasma & OLED).

Lastly, people who just copy settings they find on forums most of the time don't even understand that panels vary from each other greatly, so if anything they might end up with a less accurate final image.

Edit: A properly calibrated color temperature should never make whites look horrible. If anything, once you adjust and then go back to color temperatures that push blue, you'll find it very annoying.

Not a dumb question. Warm is supposed to produce the most accurate color representation under ISF calibration methodology. You are supposed to set it to Warm2 and then calibrate to get the "right" level of colours and detail. The other common settings "blow out" colours, which looks attractive to the eye at first but may not be accurate (e.g. whites have a blue tint).

While I appreciate the reasoning behind the push to use the warm setting, I refuse to use it. I like cool settings; they just look better to me. I hate the piss filter when it comes to warm settings. I watch a lot of hockey, and the rink is not white no matter what I do, and it does not look like it does when I am at the game in person. My cool or medium setting makes it look good to me. Trust me, do what makes it look good for you; as long as your other settings are reasonable, you'll be fine. There will be many who will try to convince you otherwise, but do what looks good to you and don't feel bad for using cool.

So it sounds like unless I know what I'm doing, sticking to colors that I think look good is the best option. No matter what I do I cannot get white to not look yellow with the "warm" preset... On both of my sets (last year's Hisense and a B7 OLED), the standard picture with slight adjustments to contrast and color, plus turning off most of the "smoothing" crap, clearly gives the best picture (and the best whites in particular).

Also, does anyone have examples of LG's "super resolution" working? It doesn't seem to do much even on a 720p signal...
 

tokkun

Member
Hmm, that is kind of shitty of them to word the article like they did, since they were the ones that got hands-on time with Scorpio.

Oh well, not too long till E3, when we'll get more news from MS directly.

The HDMI 2.1 spec has not been published yet and the compliance tests have not been released. Until that happens, no one can honestly say that any piece of hardware is HDMI 2.1-compatible. The best they can do is say they intend to make it compatible.
 

Yukstin

Member
Also, does anyone have examples of LG's "super resolution" working? It doesn't seem to do much even on a 720p signal...

I haven't noticed a single difference with my C6 using that feature, and I checked it with a couple of different sources. I just turned it off.
 

GeoNeo

I disagree.
So it sounds like unless I know what I'm doing, sticking to colors that I think look good is the best option. No matter what I do I cannot get white to not look yellow with the "warm" preset... On both of my sets (last year's Hisense and a B7 OLED), the standard picture with slight adjustments to contrast and color, plus turning off most of the "smoothing" crap, clearly gives the best picture (and the best whites in particular).

Also, does anyone have examples of LG's "super resolution" working? It doesn't seem to do much even on a 720p signal...


If you don't have access to a meter, try to rent one... I highly recommend it for a display like these new OLEDs.

Have a look at the rtings.com review of the C7 as an example: http://www.rtings.com/tv/reviews/lg/c7 (Note: 2017 LG OLEDs all use the same SoC & panel, but of course individual panels vary from each other like normal.)

You can see from the pre-calibration measurements on that set that the white balance is off. Honestly, if you can't properly calibrate your set with the required tools or pay for an ISF calibration, just go with what looks best to you and enjoy your set.

The HDMI 2.1 spec has not been published yet and the compliance tests have not been released. Until that happens, no one can honestly say that any piece of hardware is HDMI 2.1-compatible. The best they can do is say they intend to make it compatible.

This is 100% true; they plan to release the tests in Q2 (which started in April). However, I'm disappointed Digital Foundry are not more transparent about the information they got. Did someone at MS assure them, or did they simply speculate themselves? If it was pure speculation on their part, the way they worded that article and video is very annoying; if someone from MS told them, I'm fine with what they reported.

Oh well, E3 is a few weeks away, so we can confirm 100% what MS plans to support.
 

mrklaw

MrArseFace
It may only look yellow compared to what you're used to. Perhaps you could leave it for a few days and you'll adjust to it?
 
It may only look yellow compared to what you're used to. Perhaps you could leave it for a few days and you'll adjust to it?

I tried to leave it on for one day. It looks yellow compared to my iPhone screen, Vita, and Surface. I know that those other devices are not really designed for the best movie-viewing experience, but let's be honest here, all of us look at them more often than we look at the TV :)
 

TheBoss1

Member
I tried to leave it on for one day. It looks yellow compared to my iPhone screen, Vita, and Surface. I know that those other devices are not really designed for the best movie-viewing experience, but let's be honest here, all of us look at them more often than we look at the TV :)

One day is not enough. At least for me, it took a few days because I was so used to cool or neutral all my life. Now I can't go back to those settings.
 

GeoNeo

I disagree.
I tried to leave it on for one day. It looks yellow compared to my iPhone screen, Vita, and Surface. I know that those other devices are not really designed for the best movie-viewing experience, but let's be honest here, all of us look at them more often than we look at the TV :)

All these other screens push blue. The Vita is by far the worst offender of the bunch; terrible out-of-box settings.

I know with my iPhone 7 Plus the white point is at 6802K out of the box, which is amazing. With my older iPhone 6 the white point was 7250K. Both tested with the i1Display Pro.

I even tested the new Samsung S8, and it allows you to set a white point which is very close to 6800K.

Another issue with calibrating without tools is that a lot of the time gamma is way off, which gives the image a flat look. When you calibrate to the proper gamma the image really pops. I calibrate to BT.1886 gamma, which is simply stunning on OLED & my Pioneer Kuro. :)
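Since BT.1886 keeps coming up: it isn't a flat 2.4 power curve, it anchors gamma 2.4 to your display's measured black and white levels, which is exactly why it needs a meter. A minimal sketch, with the 100 nit white and 0.001 nit black as placeholder measurements:

Code:
# ITU-R BT.1886 EOTF: gamma 2.4 anchored to measured white (lw) and
# black (lb) luminance in cd/m^2. lw/lb below are placeholder values.
def bt1886(v, lw=100.0, lb=0.001, gamma=2.4):
    a = (lw ** (1 / gamma) - lb ** (1 / gamma)) ** gamma
    b = lb ** (1 / gamma) / (lw ** (1 / gamma) - lb ** (1 / gamma))
    return a * max(v + b, 0.0) ** gamma    # v = normalized signal, 0..1

for v in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(v, round(bt1886(v), 3))          # 0.001 ... 100.0 cd/m^2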
 
One day is not enough. At least for me, it took a few days because I was so used to cool or neutral all my life. Now I can't go back to those settings.

All these other screens push blue. The Vita is by far the worst offender of the bunch; terrible out-of-box settings.

I know with my iPhone 7 Plus the white point is at 6802K out of the box, which is amazing. With my older iPhone 6 the white point was 7250K. Both tested with the i1Display Pro.

I even tested the new Samsung S8, and it allows you to set a white point which is very close to 6800K.

Another issue with calibrating without tools is that a lot of the time gamma is way off, which gives the image a flat look. When you calibrate to the proper gamma the image really pops. I calibrate to BT.1886 gamma, which is simply stunning on OLED & my Pioneer Kuro. :)

I can't disagree with any of this, but my eyes are broken at this point. I wouldn't be surprised if real life looks different to me because of staring at electronics all day. :)

I'm exaggerating of course, but getting used to one screen looking different from the others is almost impossible. I might try again at some point when I'm more confident in my calibration skills.
 

holygeesus

Banned
I can't disagree with any of this, but my eyes are broken at this point. I wouldn't be surprised if real life looks different to me because of staring at electronics all day. :)

I'm exaggerating of course, but getting used to one screen looking different from the others is almost impossible. I might try again at some point when I'm more confident in my calibration skills.

Have you tried Warm1 instead of the default Warm2?
 
I tried to leave it on for one day. It looks yellow compared to my iPhone screen, Vita, and Surface. I know that those other devices are not really designed for the best movie-viewing experience, but let's be honest here, all of us look at them more often than we look at the TV :)

One day didn't do it for me. Back in the early 2000s, when I started to get into calibration, it took me a while to get used to D65. It did look "yellow" at first compared to the "neutral/normal" color temperature settings I used to enjoy back then. Anyhow, I've never owned a Vita, but all of my other mobile devices are calibrated to D65 (or as close as I can get to it). :p
 

GeoNeo

I disagree.
I can't disagree with any of this, but my eyes are broken at this point. I wouldn't be surprised if real life looks different to me because of staring at electronics all day. :)

I'm exaggerating of course, but getting used to one screen looking different from the others is almost impossible. I might try again at some point when I'm more confident in my calibration skills.

You should be able to tweak the white point of these other electronics too. (Not the Vita though)

With an iPhone (if you have an older model) you can use Night Shift mode & adjust the white point in the Display & Brightness settings via the Less Warm / More Warm slider.

If I recall, the Surface is very easy to calibrate; you can set the white point to D65 and even load your own colour profiles. The out-of-the-box setting for a lot of them is over 7000K.

Def give it a shot down the road; I'd highly recommend it for all your displays. I do a lot of work on screens across all types of tech, and thankfully having this i1Display Pro has saved my eyes. :p Such a worthy investment, and I think sometimes they have them on sale on Amazon too.
 
Have you tried Warm1 instead of the default Warm2?

I'm not even talking about Warm2. Even Warm1 is pretty tough to work with. But to be fair, the Hisense's out-of-box picture is pretty good according to Rtings, and that picture is set to medium. That's actually my preference...
 

holygeesus

Banned
I'm not even talking about Warm2. Even Warm1 is pretty tough to work with. But to be fair, the Hisense's out-of-box picture is pretty good according to Rtings, and that picture is set to medium. That's actually my preference...

In my experience Warm2 makes whites look ever so slightly red, but Warm1 makes whites look white. Well, as white as the whites on my MacBook display. I know it isn't industry standard, but skin tones look far more 'realistic' with temp set to Warm1 on my display. I couldn't switch back now.
 

scently

Member
This is 100% true; they plan to release the tests in Q2 (which started in April). However, I'm disappointed Digital Foundry are not more transparent about the information they got. Did someone at MS assure them, or did they simply speculate themselves? If it was pure speculation on their part, the way they worded that article and video is very annoying; if someone from MS told them, I'm fine with what they reported.

Oh well, E3 is a few weeks away, so we can confirm 100% what MS plans to support.

MS has indicated they will include support for VRR via FreeSync, FreeSync 2, and HDMI 2.1 (when the spec is ratified). What isn't clear is if Scorpio will have full HDMI 2.1 or just the VRR part of the spec.
 

J-Rzez

Member
Damn guys, all these 2017 models looking like world beaters is making me sad, haven't even had my B6 a week :(

I'm sure you saved a couple bucks though. There are some notable improvements to be had, but would they have been worth a significant price difference? That's for you to decide, as they didn't make your B6 a worse TV; it's still THE TV to have from 2016, after all. But there will be improvements no matter what.

I was in the same boat with my EF9500 when the 6s hit. It still looks great. I just told myself I'll sell it and get a new one in 2 or 3 years. Same thing I do with video cards: I skip a gen unless I have to get one, or there's that big of a jump.
 
Although I've had a Leo Bodnar input lag tester for a good while now, I never really measured the input lag of my old Sony 50W658 LCD TV.
At the time I had some problems with the tester's USB port, and later on I simply assumed it was about 21 ms like the Sony W905.
Well, I finally gave it another go and was shocked to find out that it's about 14.6 ms, which is in line with HDTVTest's measurement of the smaller Sony 42W653/655.
Now it makes even more sense to me why I notice the difference compared to the 34 ms of the LG E6 OLED. The difference is not thaaat small anymore.
To be fair, I still utterly suck at online multiplayer, no matter how low the input lag. :D
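For perspective, converting those lag numbers into 60fps frames (one frame is 1000/60, about 16.7 ms):

Code:
# Input lag expressed in 60fps frames.
frame_ms = 1000 / 60
for name, lag_ms in [("Sony 50W658", 14.6), ("LG E6 OLED", 34.0)]:
    print(name, round(lag_ms / frame_ms, 2), "frames")  # ~0.88 vs ~2.04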
 