
Samsung QN65QN90A TV Review: Mini LED Marvel

Kuranghi

Member
Thanks for the comprehensive advice. I really only care about gaming performance - I watch movies and TV pretty rarely. My apartment gets a fair amount of natural light, particularly in the mornings (east-facing windows), and it has been a source of annoyance for me in the past. I guess I'll wait for the new generation of sets - do you think any of the LCDs will launch at ~$1500 for a 65-inch?

I wouldn't buy TVs at launch tbh; wait till the following year and they'll usually be 40% off. If you won't watch movies/TV then the LG OLED is almost perfect. I say buy an LG OLED online and try it out in your space - if the window/light is a problem then you can return it. In the UK at least they have to let you do that if you bought it without seeing it, even if you've unboxed it and everything. Double-check how it works where you live.

It's nearly the best time to buy an LG C9/CX right now; it's the cheapest it will be short of clearance, and then you risk missing the stock, because as soon as it hits its minimum price everyone will buy them within a few days/a week.
 

ZywyPL

Banned
People keep harping on about this LCD HDR brightness advantage and I just don't get it - is this simply spec sheet nonsense that doesn't translate to real-world watching? I have a 55" LG CX and to be perfectly honest the fucking thing gets bright enough for me. In any 4K movie I've watched on it, especially in the evenings when I do most of my movie watching, those HDR highlights of, say, a headlight or an explosion against the inky black look incredibly bright and realistic. I mean how much brighter can it really get, ffs, before it overwhelms the picture or becomes uncomfortable to watch? I think from memory the CX tops out around 5 or 600 nits and here we are talking about 1500-2000 nits. From where I'm sitting that just sounds like it's gonna fry my fucking eyeballs.

Fully agree here. I have a GX and I use it at 60% brightness because anything more is simply uncomfortable for my eyes for more than 1h. If I got any of those 1000-1500 nit TVs I guess I'd have to turn the brightness all the way down to 20-30%, so what's the point, other than marketing campaigns where more = better? And HDR already melts my eyes unless I'm watching all-natural content like some sort of National Geographic stuff - any subtitles, HUD elements, pop-up text etc. make my eyes bleed after just 1-1.5h. The brightness might not be the highest (770 nits from what I read), but combined with perfect blacks/infinite contrast it's a really deadly combination.
 

Kuranghi

Member
I think they would be fine with any of them. Just check how good the HDMI 2.1 support is (VRR, 120Hz). Samsung's game mode still has that problem with HDR raised blacks I think, so maybe look into that, but people way overestimate how much the panel will make a difference in games. To be honest most people would probably be happy with a random $500 TV if they weren't following the hype. Like, one of my old 2015 4K edge-lit non-HDR Samsung TVs has like 80% of the image quality that my LG CX has in 4K.

While I know full well that a $500 set will impress in isolation, I can't agree with your last statement: an edge-lit TV doesn't have anywhere near the HDR pop of an OLED, and certain scenes will look like garbage on the edge-lit set by comparison - a mass of blue and white blooming vs. inky black in a space shot, for example.

It looks great until you see what you were missing. I know that goes for any tech upgrade, but this change is particularly striking - like Macready said above, OLED smashes the "wife test", which is one of the hardest tests.
 

Andodalf

Banned
People keep harping on about this LCD HDR brightness advantage and I just don't get it - is this simply spec sheet nonsense that doesn't translate to real-world watching? I have a 55" LG CX and to be perfectly honest the fucking thing gets bright enough for me. In any 4K movie I've watched on it, especially in the evenings when I do most of my movie watching, those HDR highlights of, say, a headlight or an explosion against the inky black look incredibly bright and realistic. I mean how much brighter can it really get, ffs, before it overwhelms the picture or becomes uncomfortable to watch? I think from memory the CX tops out around 5 or 600 nits and here we are talking about 1500-2000 nits. From where I'm sitting that just sounds like it's gonna fry my fucking eyeballs.




It's not just about having a brighter image, it's about more accurately representing source material. OLED is getting there though.
 

vpance

Member
Fully agree here. I have a GX and I use it at 60% brightness because anything more is simply uncomfortable for my eyes for more than 1h. If I got any of those 1000-1500 nit TVs I guess I'd have to turn the brightness all the way down to 20-30%, so what's the point, other than marketing campaigns where more = better? And HDR already melts my eyes unless I'm watching all-natural content like some sort of National Geographic stuff - any subtitles, HUD elements, pop-up text etc. make my eyes bleed after just 1-1.5h. The brightness might not be the highest (770 nits from what I read), but combined with perfect blacks/infinite contrast it's a really deadly combination.

There’s a degree of acclimation time to the increased brightness. I felt the same way coming from my plasma to a FALD but got used to it.

Also, if you're watching in a dark room, as many OLED owners prefer, it's not unusual to feel like this, because your eyes have to work harder as the image ramps up and down between dark and bright. Having some amount of light on (bias lighting or otherwise) helps a lot.
 

kraspkibble

Permabanned.
Love my CX, but I want some new desktop monitors and OLEDs don't come in sizes that small. Will MicroLED come in 27-32 inches?
LG are making 20-42” OLED panels this year, so expect to see OLED monitors soon.

Edit: LG isn’t selling directly to customers. They are making the panels, and brands like Asus, Acer, Dell, etc. will buy the panels for use in their own products. LG is only selling 48” or higher TVs.

and by soon I mean probably like late 2022.
 
Last edited:
I wouldn't buy TVs at launch tbh; wait till the following year and they'll usually be 40% off. If you won't watch movies/TV then the LG OLED is almost perfect. I say buy an LG OLED online and try it out in your space - if the window/light is a problem then you can return it. In the UK at least they have to let you do that if you bought it without seeing it, even if you've unboxed it and everything. Double-check how it works where you live.

It's nearly the best time to buy an LG C9/CX right now; it's the cheapest it will be short of clearance, and then you risk missing the stock, because as soon as it hits its minimum price everyone will buy them within a few days/a week.
Alright, you sold me. Just picked one up for $2k on Amazon. Arrives Sunday!

Definitely not waiting another year to upgrade, so this seems like as good a time as any.
 

Kuranghi

Member
People keep harping on about this LCD HDR brightness advantage and I just don't get it - is this simply spec sheet nonsense that doesn't translate to real-world watching? I have a 55" LG CX and to be perfectly honest the fucking thing gets bright enough for me. In any 4K movie I've watched on it, especially in the evenings when I do most of my movie watching, those HDR highlights of, say, a headlight or an explosion against the inky black look incredibly bright and realistic. I mean how much brighter can it really get, ffs, before it overwhelms the picture or becomes uncomfortable to watch? I think from memory the CX tops out around 5 or 600 nits and here we are talking about 1500-2000 nits. From where I'm sitting that just sounds like it's gonna fry my fucking eyeballs.

Andodalf summed it up pretty succinctly, but here's my take:

It's nothing to do with the actual brightness/backlight/OLED light setting on the TV, or the overall amount of light that's coming out of the TV at any one time; it's to do with how accurate/good it makes the HDR look.

Tell me if this analogy works for you: it's like drawing a topographic map with finer and finer contour intervals. On the first map the peaks are "chopped off", because each contour line represents a bigger difference in height than the distance from the line just below the peak to the top of the peak, so the peak can't be shown on the map. If you rendered the mountain in 3D, it would get more and more detailed and accurately modelled as you decreased the difference in elevation that each line represents.

That awesome difference in the depth of the image you saw when going from your old set to the OLED will only increase as peak brightness increases, to the point where you will have a perfect black level AND the faces in that video I posted will look their most 3D/be rendered exactly as the source dictates. At that point, with sufficiently high video quality, it will look like a window - say, for a super-wide shot in an action movie with lots of guys running around and cars exploding, it will take on a diorama-style effect and you'll think you can put your hand into it.

That's what increased brightness does - it's not that it has nothing to do with how bright the screen can get, but that's not the main appeal. That shot in The Matrix where Morpheus is talking to Neo in the white room is like 1000 nits full-field; no TV can really do that right now, so you can't actually see it the way it was meant to be seen until a TV can.
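
To make the "chopped-off peaks" analogy concrete, here's a minimal Python sketch of a crude knee-and-compress tone map - my own illustration with made-up numbers (the 4000-nit mastering peak, the knee position), not how any real TV's processing works. The point is just that two highlights graded 1000 nits apart keep far more of their separation on a higher-peak panel:

```python
# Illustrative only: a crude knee-and-compress tone map with made-up
# numbers. Real TVs use far more sophisticated curves.

MASTERING_PEAK = 4000.0  # nits the content is assumed to be graded up to

def tone_map(scene_nits, display_peak, knee=0.75):
    """Map scene luminance to a panel: pass-through below the knee,
    everything above squeezed into the remaining headroom."""
    knee_point = knee * display_peak
    if scene_nits <= knee_point:
        return scene_nits
    excess = scene_nits - knee_point
    headroom = display_peak - knee_point
    return knee_point + headroom * excess / (MASTERING_PEAK - knee_point)

# Two highlights graded 1000 nits apart (say, a lamp vs. the sun):
for peak in (600, 1500):
    a, b = tone_map(1000, peak), tone_map(2000, peak)
    print(f"{peak:>4}-nit panel: 1000 -> {a:.0f}, 2000 -> {b:.0f}, "
          f"separation kept: {b - a:.0f} nits")
```

On the 600-nit panel the two highlights land about 40 nits apart (the "contour lines" merge); on the 1500-nit panel they stay roughly 240 nits apart, which is exactly the extra depth being described.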
 

OnionSnake

Banned
LG are making 20-42” OLED panels this year, so expect to see OLED monitors soon.

Edit: LG isn’t selling directly to customers. They are making the panels, and brands like Asus, Acer, Dell, etc. will buy the panels for use in their own products. LG is only selling 48” or higher TVs.

and by soon I mean probably like late 2022.
Ah okay, might wait for that then, thanks.
 

dolabla

Member
Including only one HDMI 2.1 port is pretty damn stupid of Samsung. At least include two, especially when two next-gen consoles that they knew were coming out have HDMI 2.1.

No Dolby Vision.
Yep, I don't think they're ever going to have it.
 

MadPanda

Banned
I love aul innuendo man Teoh, but he's looking at these tellies through an extremely experienced, calibrated eyeball lol. The rest of us plebs are not even going to notice these details sat on a couch 12-14 ft away; all I see is inky blacks, great colour and super bright highlights on my OLED. I think TVs really are getting to a point where the differences can only be picked up by the likes of Vincent.
You should watch that video. Ordinary people can see the difference in his example.
 

Kuranghi

Member
Alright, you sold me. Just picked one up for $2k on Amazon. Arrives Sunday!

Definitely not waiting another year to upgrade, so this seems like as good a time as any.

What exact model did you have before? It's on the back of the TV, on a grey or white sticker. That can give you an idea of how much of an upgrade it will be.
 

Rikkori

Member
Samsung disappoints again, huh? Maybe the 8K model will do better; last year only the Q90T wasn't gimped in game mode. Sigh.

 

vpance

Member
Andodalf summed it up pretty succinctly, but here's my take:

It's nothing to do with the actual brightness/backlight/OLED light setting on the TV, or the overall amount of light that's coming out of the TV at any one time; it's to do with how accurate/good it makes the HDR look.

Tell me if this analogy works for you: it's like drawing a topographic map with finer and finer contour intervals. On the first map the peaks are "chopped off", because each contour line represents a bigger difference in height than the distance from the line just below the peak to the top of the peak, so the peak can't be shown on the map. If you rendered the mountain in 3D, it would get more and more detailed and accurately modelled as you decreased the difference in elevation that each line represents.

That awesome difference in the depth of the image you saw when going from your old set to the OLED will only increase as peak brightness increases, to the point where you will have a perfect black level AND the faces in that video I posted will look their most 3D/be rendered exactly as the source dictates. At that point, with sufficiently high video quality, it will look like a window - say, for a super-wide shot in an action movie with lots of guys running around and cars exploding, it will take on a diorama-style effect and you'll think you can put your hand into it.

That's what increased brightness does - it's not that it has nothing to do with how bright the screen can get, but that's not the main appeal. That shot in The Matrix where Morpheus is talking to Neo in the white room is like 1000 nits full-field; no TV can really do that right now, so you can't actually see it the way it was meant to be seen until a TV can.

TVs will only get brighter, and it may no longer make sense to watch in near-complete darkness for some HDR-heavy material. That's where the bulk of the problem comes from for people sensitive to high nits. Eventually we'll get that window-of-reality effect, but eye strain will be something to deal with no matter what. The good news for those people is they can just turn the backlight slider down.
 

Kuranghi

Member
TVs will only get brighter, and it may no longer make sense to watch in near-complete darkness for some HDR-heavy material. That's where the bulk of the problem comes from for people sensitive to high nits. Eventually we'll get that window-of-reality effect, but eye strain will be something to deal with no matter what. The good news for those people is they can just turn the backlight slider down.

I do think eye strain will be an issue, yes. I already feel that now with my ZD9; I have to turn the backlight down to minimum AND turn on the light sensor for SDR content when it's pitch dark, or else it's just too much.

I think you'll always need to view reference HDR in a dark room though, because most of the image will still be in the SDR range if HDR is done right. The thing that could be great for bright-room viewing, though, is Dolby Vision IQ or similar tech, which lifts the whole image in various ways to make it more viewable in a bright room. It would lower the dynamic range and depth of the image if you don't have tons of headroom on the panel, but making it viewable in the first place while not totally destroying the image would be amazing.
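
The broad idea is simple to sketch. This is purely hypothetical logic of my own - not Dolby's actual algorithm, whose curves and inputs aren't public in this form - but it shows the principle of lifting shadows as ambient light rises while leaving highlights mostly alone:

```python
# Purely hypothetical sketch of an ambient-adaptive shadow lift - NOT
# Dolby Vision IQ's actual algorithm. Signal values normalised to [0, 1].

def ambient_lift(signal, room_lux, strength=0.15):
    """Raise shadows more as the room gets brighter; a gamma-style
    lift affects values near black far more than values near white."""
    lift = strength * min(room_lux / 500.0, 1.0)  # cap the effect at 500 lux
    return signal ** (1.0 / (1.0 + lift))

for lux in (0, 250, 500):
    shadow = ambient_lift(0.05, lux)
    highlight = ambient_lift(0.90, lux)
    print(f"{lux:>3} lux: shadow 0.05 -> {shadow:.3f}, highlight 0.90 -> {highlight:.3f}")
```

In a bright room the 0.05 shadow detail gets lifted by nearly half while the 0.90 highlight barely moves - the dynamic-range trade-off described above.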
 

Kuranghi

Member

I have more "bad TV porn" if you're interested :messenger_tears_of_joy:

That TV:

[photo of the scene as shown on that TV]


Screenshot I just took in my browser from Amazon Prime as an idea of what it should/could look like:

[screenshot of the same scene]


Can any of our eagle-eyed GAFers spot the difference?

That's why you don't buy an LG LCD for HDR viewing.
 
Last edited:

vpance

Member
I do think eye strain will be an issue, yes. I already feel that now with my ZD9; I have to turn the backlight down to minimum AND turn on the light sensor for SDR content when it's pitch dark, or else it's just too much.

I think you'll always need to view reference HDR in a dark room though, because most of the image will still be in the SDR range if HDR is done right. The thing that could be great for bright-room viewing, though, is Dolby Vision IQ or similar tech, which lifts the whole image in various ways to make it more viewable in a bright room. It would lower the dynamic range and depth of the image if you don't have tons of headroom on the panel, but making it viewable in the first place while not totally destroying the image would be amazing.

Yep, it ties back to the "HDR is too dim" complaints as well, so better options to tune the range of brightness would be welcome.

I imagine as TV tech advances closer to becoming a window of reality there will be more integration with the room's lighting. Realism may start to take priority over reference viewing, but reference viewing can always remain an option.
 

vpance

Member
What does that 10%, 20%, etc. mean? I never understood that part.

It's the size of the white square they display while testing brightness, relative to the screen's area. So a 50% window takes up half the area of the screen.

You test for peak nits in the 2-10% range. Brightness drops off after that, more so with OLEDs.
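
If it helps to visualise, here's a quick back-of-the-envelope sketch of what those windows work out to on a 4K panel (the percentage refers to screen area, so a "10% window" is a bigger square than it sounds):

```python
# What an "N% window" works out to on a 3840x2160 panel: a centred white
# square covering N% of the screen AREA (100% is just a full white field).
import math

W, H = 3840, 2160
total_px = W * H

for pct in (2, 10, 25, 50):
    side = math.isqrt(total_px * pct // 100)  # square's side length in px
    print(f"{pct:>2}% window ~ {side} x {side} px "
          f"(about {side / H:.0%} of the screen height)")
```

A 10% window comes out around 910 x 910 pixels - roughly 42% of the screen's height on a side.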
 

TLZ

Banned
It's the size of the white square they display while testing brightness, relative to the screen's area. So a 50% window takes up half the area of the screen.

You test for peak nits in the 2-10% range. Brightness drops off after that, more so with OLEDs.
Thanks for the response. But why 2-10% specifically? What's the science there? Shouldn't we be testing the whole screen since that's what we're actually watching?
 
Last edited:

vpance

Member
Thanks for the response. But why 2-10% specifically? What's the science there? Shouldn't we be testing the whole screen since that's what we're actually watching?

They basically do this test for HDR. When you see it in games and movies, HDR highlights are usually small bright areas like the sun or a headlight. So by testing small window sizes you get an idea of how bright they’ll be on screen.
 
Last edited:
  • Thoughtful
Reactions: TLZ

TLZ

Banned
They basically do this test for HDR. When you see it in games and movies, HDR highlights are usually small bright areas like the sun or a headlight. So by testing small window sizes you get an idea of how bright they’ll be on screen.
Ah makes sense. Thanks for explaining ☺️
 

Elios83

Member
MiniLED TVs basically represent the beginning of the consumer market's transition towards fully inorganic display technology, which will eventually converge with MicroLED and per-pixel light control using inorganic LEDs in a few years.
Once that happens we'll have the best of both worlds (OLED and QLED) in the same product, and I expect OLED technology to be slowly sidelined because of its reliability and low-brightness issues.
In the meantime, MiniLED TVs seem positioned to be excellent compromises: they keep the typical QLED advantages while closing the gap with OLED significantly on black levels and black uniformity.
 
Last edited:

Kuranghi

Member
Thanks for the response. But why 2-10% specifically? What's the science there? Shouldn't we be testing the whole screen since that's what we're actually watching?

It used to be a good test of how bright highlights would be in real content, but certain manufacturers *coughsamsungcough* started detecting the pattern - a pure white square on a pure black background, the square taking up 2, 10, 25 or 50% of the screen to simulate different sizes of bright elements (2% would be for small bright specular reflections off chrome or the like, or the sun, while the 25 and 50% squares are more for when there are large bright elements on screen, like the ending "basement" scene of Annihilation or the bit in The Matrix where Morpheus is in the armchair in the white void with Neo) - and abandoning all balance of the image to make the contrast as strong as possible and the white square as bright as possible, to game the tests for reviews.

That's why rtings created the "real scene" test, to show how the TV will reproduce real-world content rather than just a test pattern - read what's in the yellow/golden box here - https://www.rtings.com/tv/tests/picture-quality/peak-brightness - and also look at the weighting for the scores: it's 63% for real scene brightness, i.e. they know Samsung games the pattern, so they're trying to mitigate that with their own more honest test video.

Generally the HDR Real Scene brightness should match up with the 10% window figure if they aren't bullshitting the tests, but sometimes the 25% figure, depending on how the TV handles HDR. I'm not 100% on this, but how I understand it is: if the Real Scene brightness is similar to the 10% window brightness then the TV will make smaller objects brighter but the overall image will on average be less bright, while TVs that match the Real Scene brightness to the 25% window brightness will have a higher APL (average picture level), and therefore the non-highlight parts and shadows will be brighter/show more detail than just the highlights (sun, reflections, pure white things) - as long as the source isn't overriding that.
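
To illustrate the APL idea with a toy example of my own (not rtings' methodology): APL is just the mean luminance of the frame, so a frame can contain a searing specular highlight while still averaging out very dim:

```python
# Toy APL example - not rtings' methodology. A mostly-dim frame with one
# small specular highlight still has a low average picture level.
import numpy as np

frame = np.full((2160, 3840), 120.0)    # whole scene sitting at 120 nits
frame[1000:1100, 1800:1900] = 1500.0    # one 100x100 px highlight at 1500 nits

print(f"peak = {frame.max():.0f} nits, APL = {frame.mean():.0f} nits")
# -> peak = 1500 nits, APL = 122 nits
```

Which is also why well-mastered HDR is mostly SDR-level brightness with small, intense highlights on top.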

The reason Samsung started doing this is so they can put a sticker on the TV that says "HDR 2000" or whatever number - they're claiming it goes to 2000 nits - but the issues are these:

* The test doesn't require real-world content, just test patterns
* The panel doesn't need to hold that brightness for more than a couple of seconds before it can drop to a lower value, so you could be looking at the sun in a film for 4 seconds and it will significantly dim after a second or two (before the shot ends) on a Samsung, whereas a Sony will hold its highest peak brightness for about 20 seconds, and more likely for several minutes, before dimming - i.e. longer than any average shot in a normal film would ever be. A shot of something uber-bright like the sun is rarely going to last more than ~10 seconds anyway (see the sketch below this list).
* You don't know how the TV is set up when they do the test; it probably has every possible contrast-boosting setting enabled, and if you showed real content on it in that state it would look like cartoony trash.
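
Here's that second bullet as a sketch, with made-up numbers (no measurements of any real set implied): two panels can advertise the same peak while behaving very differently a few seconds into a bright shot:

```python
# Made-up numbers, no real measurements implied: two panels with the same
# advertised peak but different sustain times before dimming kicks in.

def brightness_at(t_seconds, peak_nits, hold_seconds, dim_factor=0.6):
    """Luminance the panel still shows t seconds into a bright shot."""
    return peak_nits if t_seconds <= hold_seconds else peak_nits * dim_factor

for t in (1, 3, 8, 15):
    fast = brightness_at(t, 2000, hold_seconds=2)    # dims after ~2 s
    slow = brightness_at(t, 2000, hold_seconds=20)   # holds for ~20 s
    print(f"t = {t:>2} s: fast dimmer {fast:.0f} nits, slow dimmer {slow:.0f} nits")
```

Both stickers would say 2000 nits; only one set is still showing 2000 nits by the middle of the shot.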

So to answer the question: rtings do test the whole screen, via the Real Scene test. Here is the SDR version - there's an HDR version linked on the page above, but you have to download it to play it properly, most easily on your TV's internal media player. This is the same video, just in SDR, so you get the idea anyway:




I don't think rtings is good for evaluating subjective stuff like upscaling quality, tone mapping or motion interpolation, but the brightness tests are pretty objective imo, so it's a great resource for cutting through manufacturer bullshit. This is why Sony's mid-range LCDs smash "high-end" Samsung LCDs for HDR picture quality even when the spec sheet would imply the Samsung should smash the Sony. They claimed "HDR 1000" for the Q60T and it's fucking 450 nits, seawards. The XF90 was 950 nits and cost the same, or like £100 more at most.
 
Last edited:
  • Fire
Reactions: TLZ

Guy Legend

Member
The new backlight tech is promising.

However this TV is lacking: no multiple HDMI 2.1 ports, no Dolby Atmos or Dolby Vision. That shouldn't be the case for a top-of-the-line TV.
 

TLZ

Banned
It used to be a good test of how bright highlights would be in real content, but certain manufacturers *coughsamsungcough* started detecting the pattern - a pure white square on a pure black background, the square taking up 2, 10, 25 or 50% of the screen to simulate different sizes of bright elements (2% would be for small bright specular reflections off chrome or the like, or the sun, while the 25 and 50% squares are more for when there are large bright elements on screen, like the ending "basement" scene of Annihilation or the bit in The Matrix where Morpheus is in the armchair in the white void with Neo) - and abandoning all balance of the image to make the contrast as strong as possible and the white square as bright as possible, to game the tests for reviews.

That's why rtings created the "real scene" test, to show how the TV will reproduce real-world content rather than just a test pattern - read what's in the yellow/golden box here - https://www.rtings.com/tv/tests/picture-quality/peak-brightness - and also look at the weighting for the scores: it's 63% for real scene brightness, i.e. they know Samsung games the pattern, so they're trying to mitigate that with their own more honest test video.

Generally the HDR Real Scene brightness should match up with the 10% window figure if they aren't bullshitting the tests, but sometimes the 25% figure, depending on how the TV handles HDR. I'm not 100% on this, but how I understand it is: if the Real Scene brightness is similar to the 10% window brightness then the TV will make smaller objects brighter but the overall image will on average be less bright, while TVs that match the Real Scene brightness to the 25% window brightness will have a higher APL (average picture level), and therefore the non-highlight parts and shadows will be brighter/show more detail than just the highlights (sun, reflections, pure white things) - as long as the source isn't overriding that.

The reason Samsung started doing this is so they can put a sticker on the TV that says "HDR 2000" or whatever number - they're claiming it goes to 2000 nits - but the issues are these:

* The test doesn't require real-world content, just test patterns
* The panel doesn't need to hold that brightness for more than a couple of seconds before it can drop to a lower value, so you could be looking at the sun in a film for 4 seconds and it will significantly dim after a second or two (before the shot ends) on a Samsung, whereas a Sony will hold its highest peak brightness for about 20 seconds, and more likely for several minutes, before dimming - i.e. longer than any average shot in a normal film would ever be. A shot of something uber-bright like the sun is rarely going to last more than ~10 seconds anyway (see the sketch below this list).
* You don't know how the TV is set up when they do the test; it probably has every possible contrast-boosting setting enabled, and if you showed real content on it in that state it would look like cartoony trash.

So to answer the question: rtings do test the whole screen, via the Real Scene test. Here is the SDR version - there's an HDR version linked on the page above, but you have to download it to play it properly, most easily on your TV's internal media player. This is the same video, just in SDR, so you get the idea anyway:



I don't think rtings is good for evaluating subjective stuff like upscaling quality, tone mapping or motion interpolation, but the brightness tests are pretty objective imo, so it's a great resource for cutting through manufacturer bullshit. This is why Sony's mid-range LCDs smash "high-end" Samsung LCDs for HDR picture quality even when the spec sheet would imply the Samsung should smash the Sony. They claimed "HDR 1000" for the Q60T and it's fucking 450 nits, seawards. The XF90 was 950 nits and cost the same, or like £100 more at most.

Oh man. You absolutely answered every question in my head in great detail. My brain is totally satiated now :D

That real scene test part is great btw. Makes more sense to me, that's why I asked what I asked.

Good info about Samsung's shiftiness too.
 
What exact model did you have before? It's on the back of the TV, on a grey or white sticker. That can give you an idea of how much of an upgrade it will be.
LG CX just arrived, absolutely gorgeous. Plenty bright for the daytime, but I'm really looking forward to playing some Demon's Souls tonight when the sun goes down.

Hate to enlist you as tech support, but... I've got a problem with my stupid soundbar (TCL TS7010), which is connected to the TV via the HDMI ARC port. The TV will turn it on and off like it's supposed to, but when I turn everything on the soundbar isn't recognized - the TV defaults to its internal speakers and I have to switch over manually. From Googling the issue I've tried turning off QuickStart+ on the TV, unplugging the TV from the wall for a minute, etc., but nothing seems to work except carefully turning the soundbar on BEFORE the TV.

That's not all that onerous a task, but still, I'd prefer to only keep one remote around. Any ideas as to how to get the TV to properly default to the soundbar when I turn it on?
 

RJMacready73

Simps for Amouranth
I didn't say it wasn't good for movies and TV in general - by virtue of being an OLED the picture quality is phenomenal, and the differences in upscaling and presentation between LG and Sony OLEDs are not visible to the average consumer. I'm just talking about the motion. The way motion is handled on OLEDs means that if you watch lots of 24/25Hz content there will be a noticeable stutter, which makes most people want to turn on motion interpolation to smooth out the motion without adding artifacts, and Sony does that much better than LG - any reviewer will tell you that; it's been like that for 5+ years.

If you watch movies and TV with motion interpolation off then it's not a problem, but I found hardly any of my customers wanted to do that - from average joes to real enthusiasts, most want to turn it on. So I always mention it for that reason.

Do you use motion interpolation on your CX? If so, try this video on it and tell me if you see any tearing during the normal-speed parts:




TBH if you have the smoothing set really high then don't watch the video - it's a torture test to see how good your motion interpolation is. I mean that sincerely; it will just be annoying going forward if you're loving the TV. I'm not trying to rip on your TV, just talking about my experience of looking at these sets for 100s of hours.

edit - I'm not 100% sure how it works on LGs, but the apps on the TV might be in a different picture mode than what you're used to, so make sure to check that and set the motion to what you normally use on HDMI inputs.


When I bought the TV I had a whole host of settings videos ready to go through to set the thing up, but when I plonked it onto Cinema with all that motion shite off, the picture looked incredible, so I didn't even bother tweaking it further. I just watched that video you posted and again it looked great - very bright, very colourful - and I noticed no stuttering or tearing; everything looked nice, smooth and natural.

Last year, when I was choosing between this and a Q90 Sammy, one of the things that worried me was reading about this stuttering on certain content, and to date I've not noticed it, even though I've deliberately looked for it - especially when we went through the LOTR remasters, which have loads of long panning shots. Again it was smooth, with incredible HDR highlight pops and those deep inky blacks making the whole Mines of Moria scene a joy to rewatch.

It's a top top telly, can't fault it.
 

Kuranghi

Member
When I bought the TV I had a whole host of settings videos ready to go through to set the thing up, but when I plonked it onto Cinema with all that motion shite off, the picture looked incredible, so I didn't even bother tweaking it further. I just watched that video you posted and again it looked great - very bright, very colourful - and I noticed no stuttering or tearing; everything looked nice, smooth and natural.

Last year, when I was choosing between this and a Q90 Sammy, one of the things that worried me was reading about this stuttering on certain content, and to date I've not noticed it, even though I've deliberately looked for it - especially when we went through the LOTR remasters, which have loads of long panning shots. Again it was smooth, with incredible HDR highlight pops and those deep inky blacks making the whole Mines of Moria scene a joy to rewatch.

It's a top top telly, can't fault it.

Yeah, I'm a big fan of motion shite off for everything.

What content did they say it stutters on?
 

llien

Member
There's also a field of technical research being done on OLED called Plasmonics which aims to greatly reduce, if not wipe out, the annoyance of OLED; likely denser OLED particles and new materials science will eventually make OLED even cheaper and more robust, allowing for better brightness and colour volume moving forward, but brightness and colour are said to be much better on the new TVs featuring LG's new panel tech.
ASUS has curiously cheap OLED notebooks inbound - 13", 1080p - which could be ordered today, with delivery promised in May:

German pricing on Intel variants starts at 1099 (no info on Ryzens)
 

RJMacready73

Simps for Amouranth
Yeah, I'm a big fan of motion shite off for everything.

What content did they say it stutters on?
Long panning shots - something to do with the framerates - but I haven't noticed it with anything: watching movies via the inbuilt player connected to my server, movies streaming from Kodi, movies on the telly, etc.
 

Kuranghi

Member
Long panning shots - something to do with the framerates - but I haven't noticed it with anything: watching movies via the inbuilt player connected to my server, movies streaming from Kodi, movies on the telly, etc.

Ah right, I think that's "telecine judder", but that only matters for people who have the motion stuff on. If you have it off, 24Hz content will be displayed correctly; you only need specific settings when the motion stuff is on, to make sure it doesn't have a little skip every now and then. You're golden because you avoid the motion stuff.
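
For anyone curious, the arithmetic behind pulldown judder generally is easy to sketch (illustrative, assuming plain frame repetition with no interpolation): 60 isn't evenly divisible by 24, so film frames have to alternate between 3 and 2 panel refreshes, whereas at 120Hz every frame gets exactly 5:

```python
# Why 24 fps judders on a 60 Hz panel but not on a 120 Hz one:
# a sketch assuming plain frame repetition (no interpolation).
import math

def repeats_per_frame(content_fps, panel_hz, frames=4):
    """How many panel refreshes each consecutive film frame is shown for."""
    pattern, acc, shown = [], 0.0, 0
    for _ in range(frames):
        acc += panel_hz / content_fps
        total = math.floor(acc + 0.5)
        pattern.append(total - shown)
        shown = total
    return pattern

print("24 fps on  60 Hz:", repeats_per_frame(24, 60))   # [3, 2, 3, 2] - uneven: judder
print("24 fps on 120 Hz:", repeats_per_frame(24, 120))  # [5, 5, 5, 5] - even: smooth
```

The uneven 3:2 cadence is the classic pulldown judder; a panel refreshing at a multiple of 24 sidesteps it entirely.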
 

RJMacready73

Simps for Amouranth
Ah right, I think that's "telecine judder", but that only matters for people who have the motion stuff on. If you have it off, 24Hz content will be displayed correctly; you only need specific settings when the motion stuff is on, to make sure it doesn't have a little skip every now and then. You're golden because you avoid the motion stuff.
I don't even know why people would have that horrible shit on tbh, the picture looks incredibly fake.
 

MrSec84

Member
ASUS has curiously cheap OLED notebooks inbound - 13", 1080p - which could be ordered today, with delivery promised in May:

German pricing on Intel variants starts at 1099 (no info on Ryzens)

I meant to type "annoyance of Burn-In, in OLEDs" in that section of my post; I mistyped.
Over time, just like any technology, it will be manufactured at greater scale and costs will come down further. Plasmonics should improve the life of panels and the technology's ability to maintain higher peak and full-field brightness for longer.

LG Display have already been able to increase the brightness of the panels going into the higher-end 2021 TVs from Sony, Panasonic and their sister company LG Electronics; I believe Philips are also using them.
These panels replace traditional hydrogen with deuterium, which can allow for 20% better brightness, plus there are additional green and blue layers, which is why we've been seeing Sony achieve 1300 nits peak brightness in their Vivid picture settings.
Sony, like Panasonic, are starting to use a dedicated heatsink layer in their top-end OLEDs.

TBH I'm not personally too interested in notebooks. I have heard the rumours of the Switch Pro potentially using OLED panels; I hope Nintendo adopts HDR and a good-brightness display with better blacks and contrast performance. VRR and DLSS, along with Variable Rate Shading, would be great too, but I guess that's off topic.
OLED is too, but Mini-LED could also show up there, and that technology has great potential as well.
 

MrSec84

Member
But IPS glow is still there, I bet.
Or are the Mini-LEDs VA?
LG are using IPS I believe; they claim 1,000,000:1 contrast ratios, which is the same as the Hisense Mini-LED Active Matrix TVs.
We'll have to wait for tests to see how the blacks and real-world contrast performance are on the LG Nano QNED sets. Presumably they will be vastly improved over traditional FALD LED IPS models: they are using quantum dots for better colour volume, plus NanoCell, which may help control blooming, and the use of many more dimming zones should have a noticeable effect on light control.

I still think VA will have the better contrast performance, because you'll get an amplification of existing VA benefits, but Mini-LED should narrow the differences between the two LCD panel technologies.
Having great dimming algorithms and overall image processing will make a big difference too, coordinating all of the technologies in use.
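
As an aside on what those contrast figures actually mean (illustrative numbers of my own, not measurements of any of these sets): a contrast ratio is just peak white divided by black level, so the same headline number can come from very different panels:

```python
# Contrast ratio = peak white / black level. Illustrative numbers only,
# not measurements of any particular TV.

def contrast_ratio(white_nits, black_nits):
    return float("inf") if black_nits == 0 else white_nits / black_nits

print(f"{contrast_ratio(1000, 0.001):,.0f}:1")  # dimming zone almost fully off -> 1,000,000:1
print(f"{contrast_ratio(400, 0.28):,.0f}:1")    # typical native IPS panel      -> 1,429:1
print(f"{contrast_ratio(800, 0.0)}:1")          # OLED, true zero black         -> inf:1
```

That's why a FALD/Mini-LED set can quote a number that huge: it's measured with the zones dimmed right down, not with bright and dark content sharing the frame.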
 

Observadorpt

Neo Member
Hi

I don't know if anyone can help me. I own a Samsung QN95A (QN90A in the US). When I choose Dolby Digital on my Xbox Series X or PS5, the sound is delayed relative to the image, but when I choose uncompressed stereo or uncompressed 5.1/7.1 the sound is fine - except my Logitech Z-5500 5.1 sound system then only stays in 2.0 (in Dolby it does 5.1, but with the delay). Does anyone who has this issue know a fix?
 
Last edited:

DeepEnigma

Gold Member
Hi

I don't know if anyone can help me. I own a Samsung QN95A (QN90A in the US). When I choose Dolby Digital on my Xbox Series X or PS5, the sound is delayed relative to the image, but when I choose uncompressed stereo or uncompressed 5.1/7.1 the sound is fine - except my Logitech Z-5500 5.1 sound system then only stays in 2.0 (in Dolby it does 5.1, but with the delay). Does anyone who has this issue know a fix?
I assume you are using HDMI?

This is a known issue with Dolby and DTS over HDMI: sound lag. eARC was supposed to alleviate that with predictive speech syncing or whatever they called it, but so far it has been a huge, disappointing dud.

Some audio systems sync the voice over HDMI in Dolby/DTS better than others (depending on how fast the processing is). But when it's bad, it's bad... and sometimes it can get worse over time, to the point where you have to resync the audio system (turn it off and on, or switch from TV audio back to the soundbar or audio system) to clean it up.
 

Ulysses 31

Member
Hi

I don't know if anyone can help me. I own a Samsung QN95A (QN90A in the US). When I choose Dolby Digital on my Xbox Series X or PS5, the sound is delayed relative to the image, but when I choose uncompressed stereo or uncompressed 5.1/7.1 the sound is fine - except my Logitech Z-5500 5.1 sound system then only stays in 2.0 (in Dolby it does 5.1, but with the delay). Does anyone who has this issue know a fix?
Have you set sound format to Pass Through in sound options? That eliminated all sound delays for me.
 

Observadorpt

Neo Member
I assume you are using HDMI?

This is a known issue with Dolby and DTS over HDMI: sound lag. eARC was supposed to alleviate that with predictive speech syncing or whatever they called it, but so far it has been a huge, disappointing dud.

Some audio systems sync the voice over HDMI in Dolby/DTS better than others (depending on how fast the processing is). But when it's bad, it's bad... and sometimes it can get worse over time, to the point where you have to resync the audio system (turn it off and on, or switch from TV audio back to the soundbar or audio system) to clean it up.
Hi

The input source is HDMI (Xbox Series X and PS5) and output is via optical.
Have you set sound format to Pass Through in sound options? That eliminated all sound delays for me.
Hi

Passthrough is greyed out - it's only available for ARC, and my sound system is via optical :( I guess my only option is to buy a soundbar.
 