
To OLED, or not to OLED

What type of TV is your main TV?

  • OLED

    Votes: 441 71.4%
  • LCD

    Votes: 113 18.3%
  • Something else

    Votes: 42 6.8%
  • I don't own a TV, just a computer monitor

    Votes: 22 3.6%

  • Total voters: 618

Pimpbaa

Member
I haven't seen the newer OLED panels, but wouldn't the newer, much brighter ones have much less of an issue in a brighter room? I could understand some older sub-1000-nit OLED TVs not being enough (I use my LG CX in a very dark room).
 

rofif

Banned
I haven't seen the newer OLED panels, but wouldn't the newer, much brighter ones have much less of an issue in a brighter room? I could understand some older sub-1000-nit OLED TVs not being enough (I use my LG CX in a very dark room).
They are still around 1,000 nits max.
But I don't really get the bright-room issue either.
I have a C1, and during the day I use it at 120 nits in SDR when browsing the internet, and I can see everything.
Idk, maybe people have it under direct sunlight.
 

b0uncyfr0

Member
Depends on several factors:

- Pricing
- Availability
- Brightness of the TV area (dark bedroom or lounge room, etc.)
- Sunlight level (to combat glare etc)
- Gaming

If you're in a sunny country, an older OLED might not be optimal :messenger_grinning_sweat:. It's perfect for a dimly lit room though.
 

grumpyGamer

Member
I just bought an LG OLED C3, 77 inch, for gaming and movies.
I don't watch conventional TV and it's my first OLED. I am a bit worried about burn-in; here where I live there is no warranty on burn-in whatsoever.
Hope it was a good buy, it arrives this week.
 
I just bought an LG OLED C3, 77 inch, for gaming and movies.
I don't watch conventional TV and it's my first OLED. I am a bit worried about burn-in; here where I live there is no warranty on burn-in whatsoever.
Hope it was a good buy, it arrives this week.

Shouldn't need to worry. I've been using mine with PC & Plex for the past two years and I'm still waiting for the burn in everyone says will happen with high PC usage
 

Arsic

Loves his juicy stink trail scent
The steam deck oled and hdr is so damn nice.

My TV is a 55” QLED from Samsung and the HDR is always a burden. Rarely looks good in games.

TV does look good but requires closing the shades.

I don’t see myself upgrading until it breaks since it’s 4K and gaming mode goes to 120fps.
 

FR1908

Member
Samsung premium QLED 75". With young children who can control the TV and the screen occasionally being paused for long periods of time when I'm not there, this is the best option.
 

Bojji

Member
Maybe it's console HDR that's bad because I play on PC and HDR is usually great.

Consoles have the same HDR quality, but you can't do shit when a game doesn't support it - on PC there is Auto HDR (super limited on Xbox), RTX HDR, various ReShade mods, etc.

HDR quality is pretty much in the hands of the developers and your panel quality; on an OLED games can look amazing, and most games are better with HDR, but there are examples where HDR looks like shit (thanks to incompetence from devs).
 

SteadyEvo

Member
Coming from an older midrange Samsung LCD, a Sony X85J, a 32" Samsung 4K QLED, and a $200 Insignia 4K, I can safely say it's worth it.

Got an LG C3 42in, and it feels like I've been gaming in the Stone Age and can finally see clearly. The butter-smooth framerate will take getting used to, and the colors pop off the screen.

The max brightness is a bit low, but the level of detail along with the frame rate makes up for it. I ended up turning mine down to 80 and enabling the blue-light feature to darken the screen a bit for late-night, no-light sessions. Found myself playing and appreciating all the work that went into crafting the game environment.

I don’t think I’ll ever go back to LCD
 


I have never seen any flicker on my M27QP Nano-IPS LCD VRR monitor; however, I was planning to buy an OLED monitor in the future and this VRR flicker worries me. They recommend turning off VRR, but IMO that's not an ideal option because there will be input lag (capping the fps a little below the max refresh rate on a VRR display removes the VSYNC input lag).

Guys, do you see this VRR flicker on your OLEDs? I wonder if this OLED VRR flickering is still visible when the framerate is not fluctuating so much, for example when playing at a locked fps (using an RTSS cap), because the example in this video (drops from 100fps to 10fps) is not common during normal gameplay (only loading screens have such a huge fluctuation). For example, I like to play at a 167fps lock on my 170Hz VRR monitor, and I also run some more demanding games at a 60fps lock. Will there be any VRR flicker in such situations?
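The capping approach described here (an RTSS-style limit a few fps below the maximum refresh) comes down to simple frame-time math; a rough sketch, where the 3 fps margin is just the common rule of thumb the poster follows, not a fixed requirement:

```python
# Frame-time math behind capping fps slightly below the max refresh rate,
# so the GPU never outruns the VRR window and VSYNC backpressure (input lag) is avoided.

def vrr_cap(refresh_hz: float, margin_fps: float = 3.0) -> dict:
    """Suggest an fps cap just below the refresh rate and report frame times."""
    cap_fps = refresh_hz - margin_fps
    return {
        "cap_fps": cap_fps,
        "refresh_ms": 1000.0 / refresh_hz,   # time per refresh cycle
        "frame_ms": 1000.0 / cap_fps,        # time per rendered frame at the cap
    }

for hz in (60, 120, 170):
    info = vrr_cap(hz)
    print(f"{hz} Hz -> cap at {info['cap_fps']:.0f} fps "
          f"({info['frame_ms']:.2f} ms/frame vs {info['refresh_ms']:.2f} ms/refresh)")
```

On a 170Hz panel this lands exactly on the 167fps lock mentioned above.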
 

rofif

Banned


I have never seen any flicker on my M27QP Nano-IPS LCD VRR monitor; however, I was planning to buy an OLED monitor in the future and this VRR flicker worries me. They recommend turning off VRR, but IMO that's not an ideal option because there will be input lag (capping the fps a little below the max refresh rate on a VRR display removes the VSYNC input lag).

Guys, do you see this VRR flicker on your OLEDs? I wonder if this OLED VRR flickering is still visible when the framerate is not fluctuating so much, for example when playing at a locked fps (using an RTSS cap), because the example in this video (drops from 100fps to 10fps) is not common during normal gameplay (only loading screens have such a huge fluctuation). For example, I like to play at a 167fps lock on my 170Hz VRR monitor, and I also run some more demanding games at a 60fps lock. Will there be any VRR flicker in such situations?

Yeah, my C1 obviously flickers with VRR.
It is only a thing in shadow/grey areas with rapid framerate changes. It's a different gamma curve for each Hz. I even recorded it when changing fps.
Tbh I almost don't notice it at all. Only in some broken PC games that jump the fps during loading or something.


Or in the old Xbox app, which ran at whatever fps you moved the mouse at, lol.


Normally not an issue.
 
Yeah, my C1 obviously flickers with VRR.
It is only a thing in shadow/grey areas with rapid framerate changes. It's a different gamma curve for each Hz. I even recorded it when changing fps.
Tbh I almost don't notice it at all. Only in some broken PC games that jump the fps during loading or something.


Or in the old Xbox app, which ran at whatever fps you moved the mouse at, lol.


Normally not an issue.

Thank you. It seems they have exaggerated the problem.
 

Mozart36

Neo Member
Thank you. It seems they have exaggerated the problem.
It is not that they exaggerated the problem, it is more that they show extreme cases. In the vid you can see them showing wild framerate fluctuations, which are not that common during gameplay. I have an OLED monitor and I only see that, for example, on game loading screens, where the fps is usually jumping like crazy. During games it is practically non-existent. Of course, there are times when you can see it in games too, like Dragon's Dogma 2, which was poorly optimized; especially in the cities it was rough sometimes (but still bearable). If you have the money, I would say go for it and buy an OLED. You won't be disappointed.
 

Diddy X

Member
I just bought an LG OLED C3, 77 inch, for gaming and movies.
I don't watch conventional TV and it's my first OLED. I am a bit worried about burn-in; here where I live there is no warranty on burn-in whatsoever.
Hope it was a good buy, it arrives this week.

Don't worry, burn-in is pretty much fixed on C3 TVs.
 

ChoosableOne

ChoosableAll
Burn-in risk, darker image (newer ones kinda solved it?), screen glare in a bright environment... I'll wait for cheaper mini-LED options from LG or Samsung.
 

rofif

Banned
I just bought an LG OLED C3, 77 inch, for gaming and movies.
I don't watch conventional TV and it's my first OLED. I am a bit worried about burn-in; here where I live there is no warranty on burn-in whatsoever.
Hope it was a good buy, it arrives this week.
I have 8-9 thousand hours on my LG C1 48" as a PS5 and PC monitor.
No burn-in, and I disabled the safety measures 5k hours ago (TPC and GSR, not on C3).
 

Luipadre

Gold Member


I have never seen any flicker on my M27QP Nano-IPS LCD VRR monitor, however, I was planning to buy an OLED monitor in the future and this VRR flicker worries me. They recommend turning off VRR, but IMO it's not ideal option because there will be input lag (capping the fps a little below the max refresh rate on VRR display removes the VSYNC input lag).

Guys do you see this VRR flicker on your OLEDs? I wonder if this OLED VRR flickering is still visible when framerate is not fluctuating so much, for example when playing at locked fps (using RTSS cap), because the example in this video (drops from 100fps to 10fps) is not common during normal gameplay (only loading screens have such a huge fluctuation). For example I like to play at 167 fps lock on my 170Hz VRR monitor, and I also run some more demanding games at 60fps lock. Will there be any VRR flicker in such situations?


I do on my monitor. Depends on the game and its performance. In some games you don't notice it, but some are really bad. I think it's tied to frame pacing and fluctuating fps. For example, in Helldivers 2 or The Finals I never notice the flicker.
 

King Dazzar

Member
Don't worry, burn-in is pretty much fixed on C3 TVs.
Uneven pixel wear will happen. It's just a question of when. If it's not for 10 years, then great. And I have no doubt that use case will have an impact too.

I remember when the 2018 sets came out, no one was getting the issues we had with the likes of the C7 (I went through 2 panels), and for a few years everyone was saying burn-in was resolved. But these days I've seen many more burn-in-related reports. My 2019 set is still going strong, but that too has shown more issues as time goes on. At this moment in time I'd be more worried about QD-OLED than a C3. But I'd still be very mindful of prolonged HUDs and static elements. For me, it's common sense if you want to get the most life out of one.
 

Bojji

Member


I have never seen any flicker on my M27QP Nano-IPS LCD VRR monitor; however, I was planning to buy an OLED monitor in the future and this VRR flicker worries me. They recommend turning off VRR, but IMO that's not an ideal option because there will be input lag (capping the fps a little below the max refresh rate on a VRR display removes the VSYNC input lag).

Guys, do you see this VRR flicker on your OLEDs? I wonder if this OLED VRR flickering is still visible when the framerate is not fluctuating so much, for example when playing at a locked fps (using an RTSS cap), because the example in this video (drops from 100fps to 10fps) is not common during normal gameplay (only loading screens have such a huge fluctuation). For example, I like to play at a 167fps lock on my 170Hz VRR monitor, and I also run some more demanding games at a 60fps lock. Will there be any VRR flicker in such situations?


Flicker will be there on most monitors/TVs with high contrast, so OLED and VA panels will be affected. On my B2 the flicker was super minimal and almost exclusive to loading screens (when the framerate jumps between low and high values), but obviously LG had to fuck something up, and the latest firmware introduced VRR gamma shift in HDR mode (SDR mode is like before). I have contacted LG Europe and LG PL (they are making TVs here) and I hope they will fix this shit in the next FW update. In the meantime, here are the solutions that should work on all panels:

- Keep the framerate at the refresh rate; flicker will only appear when the framerate drops below that, but at least there won't be stuttering (typical for VRR off). If you can't keep up with the maximum framerate, set it to a lower value: 120Hz, 100Hz, 60Hz (and 240 etc. for monitors).
- A flicker-free solution that works surprisingly well: 60Hz output has much bigger input lag than 120Hz, so use 120Hz, turn off G-Sync for the game you play, and limit the framerate to 60 using RTSS - with that you get a flicker-free image with very low input lag. Every frame will be displayed 2x.

If I knew what would happen, I wouldn't have "upgraded" the firmware, but here we are 🤷‍♂️
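The second workaround above (120Hz output, VRR off, 60fps RTSS cap) works because the frame and refresh rates divide evenly; a quick sketch of the arithmetic, with the latency claim being the poster's reasoning about scanout time:

```python
# Sketch of the "60 fps cap on a 120 Hz output, VRR off" workaround:
# each rendered frame is scanned out on exactly two 120 Hz refresh cycles, giving
# even pacing without the variable refresh intervals that trigger VRR gamma flicker.

def refreshes_per_frame(output_hz: int, fps_cap: int) -> float:
    """How many fixed refresh cycles each rendered frame occupies."""
    return output_hz / fps_cap

assert refreshes_per_frame(120, 60) == 2.0   # every frame displayed 2x, evenly paced
assert refreshes_per_frame(60, 60) == 1.0    # same pacing, but slower scanout

# Why 120 Hz beats 60 Hz at the same 60 fps cap: a full scanout takes half as long,
# so the image reaches the screen sooner after each frame is presented.
scanout_ms_120 = 1000 / 120   # ~8.3 ms per refresh
scanout_ms_60 = 1000 / 60     # ~16.7 ms per refresh
print(f"Scanout: {scanout_ms_120:.1f} ms at 120 Hz vs {scanout_ms_60:.1f} ms at 60 Hz")
```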
 

GriffinCorp

Member
Last night I set up my first OLED. I was able to get a good deal on a new LG 77" C3, and I'm a believer in the hype. I was able to turn on all the video options for the Xbox Series X, and wow, I've been missing out for years. COD was great, and I liked the 120Hz/fps option with Game Mode.

I've heard that most of the major burn-in issues have been fixed. That is still my main concern, but so many on GAF have played tons of hours on their sets without issues, so I'm not that worried about it.

A few weeks back I ordered the Aliens 4K Blu-ray, and man, that looked amazing on OLED. We watched it in Filmmaker Mode, but I will do more research on which settings I should be using.
 
Flicker will be there on most monitors/TVs with high contrast, so OLED and VA panels will be affected. On my B2 the flicker was super minimal and almost exclusive to loading screens (when the framerate jumps between low and high values), but obviously LG had to fuck something up, and the latest firmware introduced VRR gamma shift in HDR mode (SDR mode is like before). I have contacted LG Europe and LG PL (they are making TVs here) and I hope they will fix this shit in the next FW update. In the meantime, here are the solutions that should work on all panels:

- Keep the framerate at the refresh rate; flicker will only appear when the framerate drops below that, but at least there won't be stuttering (typical for VRR off). If you can't keep up with the maximum framerate, set it to a lower value: 120Hz, 100Hz, 60Hz (and 240 etc. for monitors).
- A flicker-free solution that works surprisingly well: 60Hz output has much bigger input lag than 120Hz, so use 120Hz, turn off G-Sync for the game you play, and limit the framerate to 60 using RTSS - with that you get a flicker-free image with very low input lag. Every frame will be displayed 2x.

If I knew what would happen, I wouldn't have "upgraded" the firmware, but here we are 🤷‍♂️
I always keep my framerate 2-3fps below my max refresh rate (I'm using RTSS to cap the framerate, because that's the only way to mitigate VSYNC input lag). Will OLEDs still flicker in a situation like that?
 

Bojji

Member
I always keep my framerate 2-3fps below my max refresh rate (I'm using RTSS to cap the framerate, because that's the only way to mitigate VSYNC input lag). Will OLEDs still flicker in a situation like that?

Before this fucking firmware update I did exactly the same thing and it worked really well, but now everything between 60 and 119FPS/Hz looks really bad in darker scenes.

I think most displays should work like my TV did before the update, so a stable framerate should be mostly fine; if you can hit ~115FPS (this is the cap Reflex sets, so ~5FPS below refresh is probably the best) most of the time, it will look OK.
 

nemiroff

Gold Member
Mini-LED will have a short future... and that's a good thing (better tech incoming).

Micro-LED will get there, I guess. Interestingly, several VR headsets already have 3,000-5,000-nit 4K micro-OLED panels as standard (TCL recently demoed a 10,000-nit micro-OLED display, btw).

HOWEVER, what's on "everyone's" mind right now as a likely real contender for a new display standard is QDEL technology (Sharp). It was shown behind closed doors at CES. It's basically a self-emissive quantum-dot layer (QD has until now been absorb-->re-emit). It seems a lot more promising than micro-OLED because it requires relatively small-scale factory retooling, which means it should be able to provide all the micro-OLED advantages for a much, much lower price. And there's more good news: it's basically ready to go. All we need is for contracts to be written.
 

Jinzo Prime

Gold Member
Mini-LED will have a short future... and that's a good thing (better tech incoming).

Micro-LED will get there, I guess. Interestingly, several VR headsets already have 3,000-5,000-nit 4K micro-OLED panels as standard (TCL recently demoed a 10,000-nit micro-OLED display, btw).

HOWEVER, what's on "everyone's" mind right now as a likely real contender for a new display standard is QDEL technology (Sharp). It was shown behind closed doors at CES. It's basically a self-emissive quantum-dot layer (QD has until now been absorb-->re-emit). It seems a lot more promising than micro-OLED because it requires relatively small-scale factory retooling, which means it should be able to provide all the micro-OLED advantages for a much, much lower price. And there's more good news: it's basically ready to go. All we need is for contracts to be written.

I saw the Digital Trends video on QDEL earlier this year, but I have been unable to find out much more about it since. Do we have any more info about it? It is my most anticipated display technology in a long time.
 

Hoddi

Member
I got an LG C2 OLED screen; I use it as a PC monitor.

Are there negative effects of putting 12bpc on over 8?
Not really. But unless you have an SDR application that explicitly calls for 10/12bpc, you won't ever see 10/12bpc being used.

Nvidia has an article on it here.
 

Hoddi

Member


I have never seen any flicker on my M27QP Nano-IPS LCD VRR monitor; however, I was planning to buy an OLED monitor in the future and this VRR flicker worries me. They recommend turning off VRR, but IMO that's not an ideal option because there will be input lag (capping the fps a little below the max refresh rate on a VRR display removes the VSYNC input lag).

Guys, do you see this VRR flicker on your OLEDs? I wonder if this OLED VRR flickering is still visible when the framerate is not fluctuating so much, for example when playing at a locked fps (using an RTSS cap), because the example in this video (drops from 100fps to 10fps) is not common during normal gameplay (only loading screens have such a huge fluctuation). For example, I like to play at a 167fps lock on my 170Hz VRR monitor, and I also run some more demanding games at a 60fps lock. Will there be any VRR flicker in such situations?

I've been using an OLED monitor for the past year (AW3423DWF) and I wouldn't sweat over it. It's very much a real phenomenon but it's not something that you notice outside of complete edge cases.

People blow it out of proportion and you shouldn't worry about it. It's not a thing that actually matters while you're using the monitor.
 

DryvBy

Member
I just bought a LG oled c3, 77inch for gaming and movies.
I dont watch conventional tv and its my first oled, i am a bit worried about burn in, here where i live there is no warranty on burn in whatsoever.
Hope it was a good biy, it arrives this week
My wife leaves Wheel of Fortune running on ours almost all day and has for months thanks to Pluto TV. I haven't had a single issue on a 2023 Bravia OLED.
 

buenoblue

Member
I don't see VRR flicker but when I switch between VRR on and off there is a very noticeable dip in HDR brightness, especially the HDR highlights.

I have Samsung S95B OLED. Anyone noticed brightness dipping on HDR with VRR on with Sony or LG oleds?

It's not a deal breaker but I've searched online and can't find any info on this. I'm mainly using PS5.
 

Bojji

Member
I got an LG C2 OLED screen; I use it as a PC monitor.

Are there negative effects of putting 12bpc on over 8?

Not really, but there are also no benefits; 10 bits is realistically what current TVs can show (most of them), and the majority of content is 8-bit (SDR) or 10-bit (HDR).

With Nvidia, when set to automatic, it outputs 8-bit in SDR or 10-bit in HDR, but you can force anything you want.
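For context on these bit-depth numbers, a quick sketch of how many gradations each setting actually gives per channel, and why the step size matters for banding:

```python
# Gradations per channel at each output bit depth, and the step size as a
# percentage of full range -- illustrating why 8-bit gradients can show banding
# while 10-bit HDR content has far finer steps.

def gradations(bits: int) -> int:
    """Number of code values per channel at a given bit depth."""
    return 2 ** bits

for bits in (8, 10, 12):
    levels = gradations(bits)
    step_pct = 100.0 / (levels - 1)   # spacing between adjacent code values
    print(f"{bits:2d} bpc: {levels:5d} levels/channel, {step_pct:.4f}% per step")
```

A 12bpc output signal has 16x the code values of 8bpc, but as the post says, that only helps if the content and panel can actually use them.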
 

Meicyn

Gold Member

I ended up preordering the Bravia 9 because while the 5000 nits peak is very impressive, full window peaks at 600 nits according to an article I read. My four year old Sony FALD can do more than that on full window, so that was disappointing to read. As bright as the U9N gets in 10% windows or less, the moment a flashbang happens in my gaming sessions, the U9N would have a white screen that would be slightly duller compared to what I already own. That’s a dealbreaker since I want every aspect to be an improvement if I’m throwing out thousands for a TV several years later. The X95L from last year hit 766 nits full window sustained and so the Bravia 9 should get close to the coveted 1000 nits with near 4000 nits for the smaller windows. I need a higher floor more than I need a higher ceiling in my viewing conditions.

“Brightness with a fullscreen white test pattern in the same mode was 600 nits, a result that also bests Samsung’s flagship TV. On other tests, the 75-inch U9N managed a very impressive 99.3% coverage of the UHDA-P3 color gamut and 82% of the BT.2020 color gamut, and a measured input lag of 15.2ms when in Game mode. That level of input lag isn’t the best we’ve measured – some of the best gaming TVs clock in under 10ms – but it’s still considered good enough.”

Source: https://www.msn.com/en-us/lifestyle...und-too-samsung-should-be-worried/ar-AA1o8RHp
 

HeisenbergFX4

Gold Member
I ended up preordering the Bravia 9 because while the 5000 nits peak is very impressive, full window peaks at 600 nits according to an article I read. My four year old Sony FALD can do more than that on full window, so that was disappointing to read. As bright as the U9N gets in 10% windows or less, the moment a flashbang happens in my gaming sessions, the U9N would have a white screen that would be slightly duller compared to what I already own. That’s a dealbreaker since I want every aspect to be an improvement if I’m throwing out thousands for a TV several years later. The X95L from last year hit 766 nits full window sustained and so the Bravia 9 should get close to the coveted 1000 nits with near 4000 nits for the smaller windows. I need a higher floor more than I need a higher ceiling in my viewing conditions.

“Brightness with a fullscreen white test pattern in the same mode was 600 nits, a result that also bests Samsung’s flagship TV. On other tests, the 75-inch U9N managed a very impressive 99.3% coverage of the UHDA-P3 color gamut and 82% of the BT.2020 color gamut, and a measured input lag of 15.2ms when in Game mode. That level of input lag isn’t the best we’ve measured – some of the best gaming TVs clock in under 10ms – but it’s still considered good enough.”

Source: https://www.msn.com/en-us/lifestyle...und-too-samsung-should-be-worried/ar-AA1o8RHp
That does seem a little low for full screen but that 5k nits is pretty sexy sounding to me :)
 


I have never seen any flicker on my M27QP Nano-IPS LCD VRR monitor; however, I was planning to buy an OLED monitor in the future and this VRR flicker worries me. They recommend turning off VRR, but IMO that's not an ideal option because there will be input lag (capping the fps a little below the max refresh rate on a VRR display removes the VSYNC input lag).

Guys, do you see this VRR flicker on your OLEDs? I wonder if this OLED VRR flickering is still visible when the framerate is not fluctuating so much, for example when playing at a locked fps (using an RTSS cap), because the example in this video (drops from 100fps to 10fps) is not common during normal gameplay (only loading screens have such a huge fluctuation). For example, I like to play at a 167fps lock on my 170Hz VRR monitor, and I also run some more demanding games at a 60fps lock. Will there be any VRR flicker in such situations?

I don't think I've ever noticed it on my C2, but I definitely notice the issue with low frame rate content

I love OLED, and I already have the new OLED iPad Pro preordered but I'm not against going Mini-LED for my next tv.
 

rofif

Banned
Some HDR from Stellar Blade on the LG C1.
Look at the depth of color and the vibrancy of it!
6sSGZPB.jpeg

qtG71SV.jpeg

2E82IH2.jpeg

uNgtIrl.jpeg
Ub3DtJt.jpeg

g7hzZ9D.jpeg

iPhone pics still don't really show how HDR looks to someone who has never seen it, but it's a pretty good relative comparison.
And 2 shots comparing the RAW screenshot to a picture of it in the HDR container before moving it off the PS5. Exporting a screenshot from the PS5 converts it to SDR, which is pretty close to real SDR. Viewing shots in the media library on the PS5 keeps the HDR metadata until you export them. I find it usually a pretty good comparison of SDR vs HDR, except the iPhone can auto-correct the color balance a bit too much.
The pics are still warm-looking in reality, but they show how the highlights pop; the iPhone just cools down (lol wut) the warm tone of the game.
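Conceptually, the export step described here compresses HDR luminance into the SDR range. A minimal sketch using Reinhard tone mapping as a stand-in, since Sony's actual PS5 conversion operator is not public:

```python
# Illustrative sketch of what an HDR->SDR export step does conceptually:
# compress linear HDR luminance into [0, 1) for SDR. Reinhard tone mapping is
# used as a stand-in; the PS5's real conversion is not publicly documented.

def reinhard(l_hdr: float) -> float:
    """Map linear HDR luminance (any non-negative value) into [0, 1)."""
    return l_hdr / (1.0 + l_hdr)

# Bright highlights are compressed far more than midtones, which is why SDR
# exports lose the highlight "pop" visible on the actual HDR display.
for luminance in (0.5, 1.0, 4.0, 10.0):
    print(f"HDR {luminance:5.1f} -> SDR {reinhard(luminance):.3f}")
```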

qtDkEbj.jpeg

6sSGZPB.jpeg

Ib7WaRd.jpeg

uNgtIrl.jpeg


And 2 more SDR vs HDR: Forspoken and FF16.
Y50Sn3r.jpeg

RONgnke.jpeg

WXokjL1.jpeg

qP4PRWP.jpeg
 

Forth

Neophyte
I turned off HDR on my LG CX simply because, once I got over the initial wow factor, it just started to hurt my eyes.
I do like the specular highlights it produces, but it's just too much.
 