
The LG C9 OLED TV - is it bright enough for HDR video and gaming?

Kuranghi

Member
Try watching this clip with your TV's smoothing/interpolation/dejudder settings turned on at even the minimum level. It's a torture test for those types of motion settings and will show how good the motion processing on your TV is.

You'll probably regret watching it if you have them turned up, but as others in this thread say, turn them all off and enjoy the beautifully clear feathers and tassels at 24Hz.
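For anyone wondering why 24Hz content judders on some displays in the first place: the panel has to hold each film frame for a whole number of refreshes, and when the refresh rate isn't an even multiple of 24 the hold times alternate, which is the judder you see on pans. A quick sketch of that arithmetic (the `pulldown_cadence` helper is made up purely for illustration):

```python
# Why 24fps film judders on a 60Hz panel but not at 120Hz:
# each source frame must be shown for a whole number of refreshes,
# and when 24 doesn't divide the refresh rate evenly the hold times alternate.

def pulldown_cadence(source_fps: int, refresh_hz: int) -> list[int]:
    """Refreshes spent on each source frame over one second (e.g. 2:3 pulldown)."""
    ends = [(i + 1) * refresh_hz // source_fps for i in range(source_fps)]
    return [b - a for a, b in zip([0] + ends, ends)]

print(pulldown_cadence(24, 60)[:4])   # [2, 3, 2, 3] -> uneven hold times, judder
print(pulldown_cadence(24, 120)[:4])  # [5, 5, 5, 5] -> even cadence, no judder
```

A 120Hz-capable panel can show every frame for exactly five refreshes, which is why proper 24p playback modes remove the judder without any interpolation.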

 
Black Friday time. I'm on the fence. Convince me.

What is keeping me from going all in is the judder issue in movies and the brightness variance. Should I worry?
Haha. Why convince you? I'm enjoying mine like the rest of us. Man up and choose for yourself.

And just for shits and giggles: worry. Lol
 

Skyr

Member
In the meantime, this person on AVS Forum sets it exactly how I do. I came from a B7A, which had very noticeable stutter.


Post 3478


That has been my impression so far. It does still stutter in some places, but it seems to be overall much smoother.

A slight improvement would probably be sufficient for me.

It still makes you wonder how much the C10 and later models will improve this further, but as it's inherent to the tech, I would guess it will never be completely eliminated.
I also wonder if MicroLED displays will have the same problem.
 

JohnnyFootball

GerAlt-Right. Ciriously.
Try watching this clip with your TV's smoothing/interpolation/dejudder settings turned on at even the minimum level. It's a torture test for those types of motion settings and will show how good the motion processing on your TV is.

You'll probably regret watching it if you have them turned up, but as others in this thread say, turn them all off and enjoy the beautifully clear feathers and tassels at 24Hz.


Thank you very much for this. I will test and compare it on my B7A and C9.
 

JohnnyFootball

GerAlt-Right. Ciriously.
Black Friday time. I'm on the fence. Convince me.

What is keeping me from going all in is the judder issue in movies and the brightness variance. Should I worry?
On second thought, just give up TV watching altogether. No reason to ever buy a new TV again.
 

Tygeezy

Member
A slight improvement would probably be sufficient for me.

It still makes you wonder how much the C10 and later models will improve this further, but as it's inherent to the tech, I would guess it will never be completely eliminated.
I also wonder if MicroLED displays will have the same problem.
MicroLED will have the same problem, because it's a sample-and-hold technology and it too has a sub-1ms response time.
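To put rough numbers on the sample-and-hold point: the eye tracks a moving object while each frame stays lit, so perceived motion blur is roughly the frame persistence multiplied by the pan speed, regardless of how fast the pixels themselves switch. A back-of-the-envelope sketch (the figures and the `blur_px` helper are illustrative, not measurements):

```python
# Sample-and-hold motion blur estimate:
# perceived blur (px) ~= frame persistence (s) * pan speed (px/s).
# Sub-1ms pixel response barely matters; how long the frame stays lit dominates.

def blur_px(refresh_hz: float, pan_speed_px_s: float, duty: float = 1.0) -> float:
    """duty < 1.0 models BFI/strobing (frame lit for only part of the refresh)."""
    persistence_s = duty / refresh_hz
    return persistence_s * pan_speed_px_s

pan = 1920  # one screen width per second, a fast pan
print(round(blur_px(60, pan)))        # 32 px of blur at 60Hz, full persistence
print(round(blur_px(60, pan, 0.5)))   # 16 px with 50% BFI
print(round(blur_px(120, pan)))       # 16 px by doubling the refresh rate instead
```

This is why BFI and higher refresh rates both help: each one shortens how long a single frame is held on screen.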
 

Kuranghi

Member
We spent all these years trying to make LCD motion as good as CRT/plasma, and now we'll be back to square one with OLED/MicroLED :( The future does not look bright for smooth 24-30Hz motion. Maybe we all just need to upgrade our eyes to fix this problem, ha.

I've been reading the last few days about input lag and frame times and how they are related. Does an input lag of less than ~33.33ms really matter when you are at 30Hz? Or less than ~16.66ms at 60Hz? Obviously you won't see the effect of your input until the frame is updated, but I'm not sure whether the game engine even processes your input before the next frame, hence my question.

I fell down a Blur Busters rabbit hole regarding this, can anyone weigh in on it? Maybe even with some links to a technical description of what's happening when you send an input to a game in between frames.

edit - Forgot, this was the comment that made me start thinking about the whole issue - https://linustechtips.com/main/topi...aming-input-lag/?tab=comments#comment-9500179

edit 2 - also please post your reaction time for willy waving purposes, it's linked in the comment above but here it is as well - https://www.humanbenchmark.com/tests/reactiontime - I know my profile picture makes me look young but apparently I'm a worn out old man now because my average was 260ms haha.
 

Tygeezy

Member
We spent all these years trying to make LCD motion as good as CRT/plasma, and now we'll be back to square one with OLED/MicroLED :( The future does not look bright for smooth 24-30Hz motion. Maybe we all just need to upgrade our eyes to fix this problem, ha.

I've been reading the last few days about input lag and frame times and how they are related. Does an input lag of less than ~33.33ms really matter when you are at 30Hz? Or less than ~16.66ms at 60Hz? Obviously you won't see the effect of your input until the frame is updated, but I'm not sure whether the game engine even processes your input before the next frame, hence my question.

I fell down a Blur Busters rabbit hole regarding this, can anyone weigh in on it? Maybe even with some links to a technical description of what's happening when you send an input to a game in between frames.

edit - Forgot, this was the comment that made me start thinking about the whole issue - https://linustechtips.com/main/topi...aming-input-lag/?tab=comments#comment-9500179

edit 2 - also please post your reaction time for willy waving purposes, it's linked in the comment above but here it is as well - https://www.humanbenchmark.com/tests/reactiontime - I know my profile picture makes me look young but apparently I'm a worn out old man now because my average was 260ms haha.
Even if your reaction time is genuinely 260ms, I can promise you that you can tell the difference between, say, 120ms and 35ms of input lag. In fact, all you have to do is fire up CS:GO at 60Hz and use vsync with an uncapped framerate. Those measured input lag times are well north of 100ms, which is where people complain about "mouse lag" - actually just input lag. Panning the camera with a mouse makes input lag much easier to notice than pressing a button and waiting for a response. It's also important to note that the input lag figures being quoted for these TVs cover only the display's portion of the input lag chain. 33ms total input lag is excellent, but there is no way you are getting lag that low at 30fps, since that would only be the render-time part of the chain, without factoring in input devices, display lag, the game engine, etc.
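To illustrate that last point, here is a toy breakdown of a full input lag chain at 30fps. Every number below is an assumption picked for illustration, not a measurement of any real controller, game, or TV:

```python
# Toy end-to-end input lag chain (illustrative numbers only).
# The "input lag" quoted in TV reviews is just the display stage; the total
# also includes the input device, the game engine, and rendering.

FPS = 30
frame_ms = 1000 / FPS  # ~33.3ms per frame at 30fps

chain_ms = {
    "input device (polling, debounce)": 8,               # assumed
    "game engine (input read next tick)": frame_ms / 2,  # half a frame on average
    "render + present (one frame in flight)": frame_ms,
    "display processing (TV game mode)": 13,             # assumed
}

total = sum(chain_ms.values())
print(f"display stage alone: {chain_ms['display processing (TV game mode)']} ms")
print(f"total chain: {total:.0f} ms")  # well above a single 33ms frame interval
```

Even with generous assumptions the chain lands well above one frame interval, which matches the point that a quoted 33ms "input lag" can only ever be the display's share.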

 

kittoo

Cretinously credulous
So it's been some 2-3 days since I've owned the C9. I am slowly settling on settings and whatnot. Initially I thought the 'Technicolor Expert' setting everyone recommended made the colors look unnatural, but after going back and forth quite a bit, especially when playing Fallen Order, I have realized that this setting does give the best HDR results IMHO.
And the blacks! Wow! I was in a dark cave in the game and goddamn, I really could not see anything other than what the torch illuminated. It was pitch black. It wasn't grey or slightly illuminated somehow, where I could still see things, like it has always been on other TVs for me. No, it was pitch black. Made the gaming experience so much better.
Yes sir, slowly I am shifting from 75% satisfied to 100%.
 

Arkage

Banned
One thing I would look into is whether there's lag through Dolby Vision. I have a C6 and it's great in everything except Dolby Vision, where the video lags significantly behind the audio. Pain in the ass and just something I've learned to deal with, especially since more streaming shows are now coming out in that format. I have everything running through my receiver, so I can't adjust the lag for DV without then screwing up the other formats.
 

JohnnyFootball

GerAlt-Right. Ciriously.
So it's been some 2-3 days since I've owned the C9. I am slowly settling on settings and whatnot. Initially I thought the 'Technicolor Expert' setting everyone recommended made the colors look unnatural, but after going back and forth quite a bit, especially when playing Fallen Order, I have realized that this setting does give the best HDR results IMHO.
And the blacks! Wow! I was in a dark cave in the game and goddamn, I really could not see anything other than what the torch illuminated. It was pitch black. It wasn't grey or slightly illuminated somehow, where I could still see things, like it has always been on other TVs for me. No, it was pitch black. Made the gaming experience so much better.
Yes sir, slowly I am shifting from 75% satisfied to 100%.
 

Venuspower

Member
thought the 'Technicolor Expert' setting everyone recommended IMHO.

The Technicolor setting is not even the most accurate one, since it uses a "wrong" white point (they were using a projector as a reference for that picture mode; Edit 1: a Xenon DCI cinema projector is the one they are trying to match). Edit 2: This only applies to Warm 1 (which is the default color temperature used by Technicolor). Other color temperatures will use the D65 white point.

The most accurate picture modes you can find:
- ISF Expert Dark Room (SDR) [just a tiny bit better than the bright room version]
- ISF Expert Bright Room (SDR)
- Technicolor (with e.g. Warm 2) (But still not as good as the Dark Room mode according to some calibrators)
- Cinema (For HDR and DV)

Edit: Do not get me wrong here. In the end, everyone is allowed to use the picture mode they like the most :D
 

kittoo

Cretinously credulous
The Technicolor setting is not even the most accurate one, since it uses a "wrong" white point (they were using a projector as a reference for that picture mode; Edit 1: a Xenon DCI cinema projector is the one they are trying to match). Edit 2: This only applies to Warm 1 (which is the default color temperature used by Technicolor). Other color temperatures will use the D65 white point.

The most accurate picture modes you can find:
- ISF Expert Dark Room (SDR) [just a tiny bit better than the bright room version]
- ISF Expert Bright Room (SDR)
- Technicolor (with e.g. Warm 2) (But still not as good as the Dark Room mode according to some calibrators)
- Cinema (For HDR and DV)

Edit: Do not get me wrong here. In the end, everyone is allowed to use the picture mode they like the most :D

Thanks for the reply. I guess I've got some more settings to do :p
Also, is Dynamic Contrast always greyed out in HDR? I can't change it at all as long as I am playing anything in HDR.
I am also just as happy :)
 

Amaranty

Member
Clear. It works wonders. Not completely perfect, but it's much better than it was on the non-Alpha 9 models.

Do you also play on consoles? Does the stutter on OLED not bother you in 30fps games? I have a Sony X900F myself and the stuttering in RDR 2 was a bit distracting sometimes. I imagine it's even worse on OLED.

Does the Clear setting on the C9 turn on BFI?
 

Tygeezy

Member
Do you also play on consoles? Does the stutter on OLED not bother you in 30fps games? I have a Sony X900F myself and the stuttering in RDR 2 was a bit distracting sometimes. I imagine it's even worse on OLED.

Does the Clear setting on the C9 turn on BFI?
"OLED Motion" turns on black frame insertion.
 

Venuspower

Member
Also, is Dynamic Contrast always greyed out in HDR? I can't change it at all as long as I am playing anything in HDR.

Yep.
I would not recommend using Dynamic Contrast anyway.
IMHO it does not add any benefit. Most of the time it even looks bad.
 

JohnnyFootball

GerAlt-Right. Ciriously.
Do you also play on consoles? Does the stutter on OLED not bother you in 30fps games? I have a Sony X900F myself and the stuttering in RDR 2 was a bit distracting sometimes. I imagine it's even worse on OLED.

Does the Clear setting on the C9 turn on BFI?
No. BFI is the most pointless setting I have ever seen. I have not once, in any circumstance, found it to be useful. I know it has a purpose, but for me all it does is cause a noticeable flicker and darken the image.
As for 30fps:
It's distracting when going to 30 from 60 or higher. And yes, 30fps does have a bit of stutter; it's at its absolute worst in The Witcher 3 when moving across the map. But I get used to it after a while and don't notice it. I will never use any form of motion interpolation for console gaming. It adds wayyy too much input lag and does strange things to the image that are far more distracting.

Motion interpolation is not a feature that was ever intended for gaming and I don't recommend it EVER.

On that note, Samsung is so far the only TV maker that lets you enable a form of motion interpolation without it drastically increasing the input lag. I haven't seen this feature myself since I don't buy Samsung and won't consider them until they support Dolby Vision.
 

jts

...hate me...
A little Black Friday week sale and boom, I've now ordered a 55" C9. Feels bad replacing a 2017 Sony XE9005, but I couldn't live with the regret of not getting an OLED instead. Besides, I've got a mate to offload that TV to, for half of what I paid. I'm content with that.

I'll still sit on it for 1 week in case BF proper brings an even better deal, but boy am I excited. Switch games will never look better :D
 

Siri

Banned
Congratulations on being the second owner of your new TV 😊

The panel was coated with a plastic film, which had to be peeled away - no way could that have been put back on by the store after a return. When I unboxed the display it was obviously from the factory.

So the 55 inch C9 ships with an HDMI cable - you’re 100% confirming this?
 

JohnnyFootball

GerAlt-Right. Ciriously.
Congratulations on being the second owner of your new TV 😊
The panel was coated with a plastic film, which had to be peeled away - no way could that have been put back on by the store after a return. When I unboxed the display it was obviously from the factory.

So the 55 inch C9 ships with an HDMI cable - you’re 100% confirming this?
No, he is (somehow) misunderstanding your post or doesn't know what he is talking about. The TV does not ship with an HDMI cable. I've bought two LG OLEDs and neither of them shipped with an HDMI cable. I haven't owned a single TV that ever has.

If it had been returned and you had bought an open-box unit, that would have been made known to you.

Not to mention that LG TV boxes are held together with straps that, once broken, can't be put back on.
Unless the person who bought it went to the trouble of reapplying the coating and bought new straps that require special equipment to apply, there is pretty much no way you are not the first owner of the TV.

You're good dude.
 

DeepEnigma

Gold Member
So it's been some 2-3 days since I've owned the C9. I am slowly settling on settings and whatnot. Initially I thought the 'Technicolor Expert' setting everyone recommended made the colors look unnatural, but after going back and forth quite a bit, especially when playing Fallen Order, I have realized that this setting does give the best HDR results IMHO.
And the blacks! Wow! I was in a dark cave in the game and goddamn, I really could not see anything other than what the torch illuminated. It was pitch black. It wasn't grey or slightly illuminated somehow, where I could still see things, like it has always been on other TVs for me. No, it was pitch black. Made the gaming experience so much better.
Yes sir, slowly I am shifting from 75% satisfied to 100%.

The Technicolor setting is not even the most accurate one, since it uses a "wrong" white point (they were using a projector as a reference for that picture mode; Edit 1: a Xenon DCI cinema projector is the one they are trying to match). Edit 2: This only applies to Warm 1 (which is the default color temperature used by Technicolor). Other color temperatures will use the D65 white point.

The most accurate picture modes you can find:
- ISF Expert Dark Room (SDR) [just a tiny bit better than the bright room version]
- ISF Expert Bright Room (SDR)
- Technicolor (with e.g. Warm 2) (But still not as good as the Dark Room mode according to some calibrators)
- Cinema (For HDR and DV)

Edit: Do not get me wrong here. In the end, everyone is allowed to use the picture mode they like the most :D
Do you have a link to those settings?
 

Amaranty

Member
No. BFI is the most pointless setting I have ever seen. I have not once, in any circumstance, found it to be useful. I know it has a purpose, but for me all it does is cause a noticeable flicker and darken the image.
As for 30fps:
It's distracting when going to 30 from 60 or higher. And yes, 30fps does have a bit of stutter; it's at its absolute worst in The Witcher 3 when moving across the map. But I get used to it after a while and don't notice it. I will never use any form of motion interpolation for console gaming. It adds wayyy too much input lag and does strange things to the image that are far more distracting.

Motion interpolation is not a feature that was ever intended for gaming and I don't recommend it EVER.

On that note, Samsung is so far the only TV maker that lets you enable a form of motion interpolation without it drastically increasing the input lag. I haven't seen this feature myself since I don't buy Samsung and won't consider them until they support Dolby Vision.
I'm considering giving my X900F to my parents, but I'm unable to decide what TV to purchase. The Samsung Q90R sounds great on paper, but Rtings states that its game mode uses 120Hz PWM dimming, which causes motion duplication. The Sony X900F also reduces its PWM dimming to 120Hz when BFI is turned on, and then motion looks quite abnormal. I wouldn't want to use Game Motion Plus on a Samsung TV since input lag is around 40ms.

The only other option currently is an OLED, either B9 or C9, since they don't use PWM dimming.
 

JohnnyFootball

GerAlt-Right. Ciriously.
I'm considering giving my X900F to my parents, but I'm unable to decide what TV to purchase. The Samsung Q90R sounds great on paper, but Rtings states that its game mode uses 120Hz PWM dimming, which causes motion duplication. The Sony X900F also reduces its PWM dimming to 120Hz when BFI is turned on, and then motion looks quite abnormal. I wouldn't want to use Game Motion Plus on a Samsung TV since input lag is around 40ms.

The only other option currently is an OLED, either B9 or C9, since they don't use PWM dimming.
The C9 is the better choice of the two, since supposedly only the Alpha 9 processor found in the C9 will support 4K 120Hz HDR, while the B9, with its Alpha 7 processor, will only support SDR at 4K 120Hz.

I have only seen that come from one source and can't say for sure.
 

jts

...hate me...
The C9 is the better choice of the two, since supposedly only the Alpha 9 processor found in the C9 will support 4K 120Hz HDR, while the B9, with its Alpha 7 processor, will only support SDR at 4K 120Hz.

I have only seen that come from one source and can't say for sure.
This is a bit of a weird question but does the C9 support PiP or any kind of windowed mode for content? With these large TVs I like to sometimes emulate a smaller screen for less light exposure (especially for my toddler as sleeping time approaches).
 

DeepEnigma

Gold Member
Are C9s next-gen-proof, or are there things to hold off on for the inevitable C10 next year?

Yes: all the HDR modes, 120Hz and VRR.

The C10 will probably have a faster processor/improvements to the A9 chip. Maybe more improvements to 24p viewing. I can't imagine it getting any lower in response time, it is already fantastically low, but we shall see.
 

DeepEnigma

Gold Member
Which settings?
ISF Expert Dark Room etc.?
They come with the TV itself.
They cannot be downloaded.

Just go into the picture settings. You will find dozens of picture modes to choose from.

Ah ok, you’re talking about presets. My bad. Thought you were going off a tweak guide somewhere.

I’m strongly considering the C9.
 

Skyr

Member
So I just realized that G-Sync on the C9 is actually only supported on Turing/RTX cards. Why would it be limited to them? Is it really hardware-limited? That fucking sucks, as I'm still on a 1080 Ti.
 

JohnnyFootball

GerAlt-Right. Ciriously.
I really had no idea that this issue was improved with the C9.
I have motion interpolation off on my C7 as I can't stand the soap opera effect. But at the same time, the stuttering in movies whenever the camera is panning bothers me a lot.

I am actually thinking about putting my 55C7D on eBay to get a 65C9 on Black Friday. Or should I wait a year for the C10? What upgrades can we expect from the C10 compared to the C9?
To follow up on this, I put in my Planet Earth II 4K disc, as it has a LOT of panning shots and those are the ones most prone to stutter. I compared TruMotion Off with the Clear setting.

I'll just say this: Clear is an absolute fucking GODSEND for getting rid of stutter without introducing artifacts or the soap opera effect. I turned it off and on, and it simply made the Off setting look horrible. The stutter isn't completely gone, but it's really minimized and I'm blown away by how well it works.

I don't recommend it for gaming at all. I tested both settings in The Witcher 3; the artifacts it created were just awful, and not once did the smoother movement look natural. There is simply not much you can do for 30fps games.
 

JohnnyFootball

GerAlt-Right. Ciriously.
So I just realized that G-Sync on the C9 is actually only supported on Turing/RTX cards. Why would it be limited to them? Is it really hardware-limited? That fucking sucks, as I'm still on a 1080 Ti.
Yes, but it's limited to 60Hz at 4K, so it's not like it's reaching its true potential on Turing. However, if you drop the resolution to 1440p, the TV can do 120Hz, and it does an outstanding job upscaling 1440p to 4K.
 

Skyr

Member
Yes, but it's limited to 60Hz at 4K, so it's not like it's reaching its true potential on Turing. However, if you drop the resolution to 1440p, the TV can do 120Hz, and it does an outstanding job upscaling 1440p to 4K.
Yeah, it's just a bummer that G-Sync won't work for me for now.
Well, the important thing is that it's future-proof, and hopefully the 30XX cards will have HDMI 2.1 to realize 4K 120Hz with G-Sync. Man, that will be sweet.
 

Skyr

Member
To follow up on this, I put in my Planet Earth II 4K disc, as it has a LOT of panning shots and those are the ones most prone to stutter. I compared TruMotion Off with the Clear setting.

I'll just say this: Clear is an absolute fucking GODSEND for getting rid of stutter without introducing artifacts or the soap opera effect. I turned it off and on, and it simply made the Off setting look horrible. The stutter isn't completely gone, but it's really minimized and I'm blown away by how well it works.

I don't recommend it for gaming at all. I tested both settings in The Witcher 3; the artifacts it created were just awful, and not once did the smoother movement look natural. There is simply not much you can do for 30fps games.
Sounds good. The important thing for me is that you don't notice any soap opera effect, and since you owned the previous model yourself and can compare, there has to be at least an improvement. Anyway, I've pretty much made my decision to get it next week. I actually have the Planet Earth II UHD Blu-ray myself, so I will be able to test it right away and report what I think.
 

JohnnyFootball

GerAlt-Right. Ciriously.
Sounds good. The important thing for me is that you don't notice any soap opera effect, and since you owned the previous model yourself and can compare, there has to be at least an improvement. Anyway, I've pretty much made my decision to get it next week. I actually have the Planet Earth II UHD Blu-ray myself, so I will be able to test it right away and report what I think.
It may not be on sale next week.
 

dsk1210

Member
To follow up on this, I put in my Planet Earth II 4K disc, as it has a LOT of panning shots and those are the ones most prone to stutter. I compared TruMotion Off with the Clear setting.

I'll just say this: Clear is an absolute fucking GODSEND for getting rid of stutter without introducing artifacts or the soap opera effect. I turned it off and on, and it simply made the Off setting look horrible. The stutter isn't completely gone, but it's really minimized and I'm blown away by how well it works.

I don't recommend it for gaming at all. I tested both settings in The Witcher 3; the artifacts it created were just awful, and not once did the smoother movement look natural. There is simply not much you can do for 30fps games.

Picked up my C9 on Friday, upgraded from a B6, and I am very happy with how Clear looks on movies and TV compared to the B6. Smooth panning without it looking like the framerate has been increased, which would produce the soap opera effect. Clear worked on the B6 too, but produced too many artifacts to be useful.

The only time I have seen the TV struggle is panning through a finely detailed forest; slight artifacts at times, but I'm very impressed.

1440p at 120Hz is awesome as well, and has me hyped for when the HDMI 2.1 cards come out next year.
 

beck_

Member
I am getting a 55" C9 in the Black Friday sales (if there's no sale, I'll buy it anyway). A quick question if anyone can help, please...

I have an AV unit which will take all my HDMI inputs; I'll then run HDMI from the AV unit to the TV. When I'm gaming on my PS4 Pro, will the TV auto-switch to game mode and use the specific display settings for that mode?

On my current TV, I have to connect my PS4 directly to the TV, with game mode etc. set on that HDMI input; the rest comes from the AV receiver. I could do this same setup for the C9, but I'm hoping I can put everything through one link and it will auto-switch settings when gaming etc.

Oh, and while I'm here: I intend to change the display settings based on the Rtings setup. Is this recommended?

Thanks for any help.
 

Skyr

Member
I am getting a 55" C9 in the Black Friday sales (if there's no sale, I'll buy it anyway). A quick question if anyone can help, please...

I have an AV unit which will take all my HDMI inputs; I'll then run HDMI from the AV unit to the TV. When I'm gaming on my PS4 Pro, will the TV auto-switch to game mode and use the specific display settings for that mode?

On my current TV, I have to connect my PS4 directly to the TV, with game mode etc. set on that HDMI input; the rest comes from the AV receiver. I could do this same setup for the C9, but I'm hoping I can put everything through one link and it will auto-switch settings when gaming etc.

Oh, and while I'm here: I intend to change the display settings based on the Rtings setup. Is this recommended?

Thanks for any help.
The PS4 Pro doesn't support auto low latency mode; so far only the Xbox One X does. Even then, you have to make sure your AV receiver can pass the signal through. My Denon receiver could do it, for example.
 