
Television Displays and Technology Thread: This is a fantasy based on OLED

The Beard

Member
Samsung gonna get mopped up hard in 2017.

- Their QLED development plans are a mess and they have no idea when the new display tech will be ready.

Sony is planning to release OLED displays (using 2017 LG panels) in 2017.

LG's panels for 2017 are gonna be stunning with better HDR support & maybe even HFR (I've seen rumours of this for months and months now, so we'll find out soon)

Panasonic will have another OLED display ready for Europe, Australia and so on too. (I doubt the U.S. will get it)

Lol, naw.

Samsung will be fine as far as sales go. Most people don't give 2 shits about premium PQ. They just want the biggest, brightest, thinnest, & cheapest LCD they can find. Samsung does great in that market.
 

sibarraz

Banned
Which are the best 4K TVs on the market with the least input lag that are also cheap? I heard that Samsung TVs were good
 

dmr87

Member
I'm really thirsting over LG's OLED TVs, but I'm gonna hold off until at least Q3/Q4 next year: more brands getting into the market, better prices, general technology improvements, and better TVs overall.

My Panasonic plasma is still up there when it comes to quality so I'm good for another year or so. My only real problem with it is that it's only 55".
 

Paragon

Member
More likely 100% P3
A lot of displays are managing complete P3 coverage with white backlighting going through regular color filters. The iPhone 7 manages this.
Samsung are said to be using blue LED backlighting with RGB quantum dot color filters in their 2017 displays, so I don't think the 98% BT.2020 coverage that has been reported is unrealistic.
100% P3 coverage is only 70-75% BT.2020 coverage, so it would be a bit disappointing if LG haven't really progressed much since last year, especially as the rumor was that they would be moving to an RGB stack instead of Blue+Yellow.
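To put a rough number on that, here's a quick back-of-envelope check using the standard CIE xy primaries (my own figures; coverage is often quoted in other colour spaces, so treat it as ballpark):

```python
# Triangle area of each gamut in the CIE xy chromaticity diagram (shoelace formula).
def triangle_area(p):
    (x1, y1), (x2, y2), (x3, y3) = p
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

p3     = [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)]  # DCI-P3 R, G, B
bt2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]  # BT.2020 R, G, B

print(triangle_area(p3) / triangle_area(bt2020))  # ~0.72, i.e. roughly 72% of BT.2020
```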

The power consumption was improved by 20% and LG will be able to ship OLED TVs that feature 1,000 nits of brightness in 2017.
I wonder if they'll be able to hit 200 nits full-screen brightness this year with these new efficiency improvements.

Samsung gonna get mopped up hard in 2017.
They're probably going to have the brightest and widest gamut displays next year, and the highest contrast LCDs you can actually buy. I wouldn't count them out yet.

LG's panels for 2017 are gonna be stunning with better HDR support & maybe even HFR (I've seen rumours of this for months and months now, so we'll find out soon)
They were demoing HFR OLEDs this year, but what we don't know is if they'll actually include 120Hz support via HDMI/DisplayPort connections.
And HFR support is really wasted without also including support for Adaptive-Sync.
An OLED with 30-120Hz Adaptive-Sync support would be a sight to behold, and be a good alternative solution for their motion handling problems - opt for having the smoothest ghost-free motion, instead of the clearest sharpest motion.
 
Adaptive Sync isn't coming to TVs.

Samsung gonna get mopped up hard in 2017.

- Their QLED development plans are a mess and they have no idea when the new display tech will be ready.

Sony is planning to release OLED displays (using 2017 LG panels) in 2017.

LG's panels for 2017 are gonna be stunning with better HDR support & maybe even HFR (I've seen rumours of this for months and months now, so we'll find out soon)

Panasonic will have another OLED display ready for Europe, Australia and so on too. (I doubt the U.S. will get it)

Sure, if by "mopped up" you mean the largest market share again for like 10 years running, the largest selection of products, and the largest share of profits by far, since TVs are a commodity product and you make money by selling many of them at a low price, not selling a tiny number at a higher price.

Considering Samsung will have the widest color gamut, the brightest HDR, and the second-best contrast ratio (only Sony does better, and only if you pony up for the Z9D), I think they'll be fine in 2017. If they achieve 100% Rec.2020 it will be a significant accomplishment. OLED will continue to be a minuscule niche like plasma was, which is why Sony are going to have plenty of LCDs to sell alongside whatever putative OLED TV they bring to market in 2017.
 

Paragon

Member
Adaptive Sync isn't coming to TVs.
If there isn't at least one TV with support for it next year, then the TV manufacturers are idiots.
Sony especially, since they could support it on PS4/Pro with a firmware update and have most of the market to themselves if no-one else supports it.
Adaptive-Sync is mostly software, and it's probably the most important upgrade that they could make for gamers.
 

Theonik

Member
If there isn't at least one TV with support for it next year, then the TV manufacturers are idiots.
Sony especially, since they could support it on PS4/Pro with a firmware update and have most of the market to themselves if no-one else supports it.
Adaptive-Sync is mostly software, and it's probably the most important upgrade that they could make for gamers.
Adaptive sync for HDMI is something that AMD made, and even if they are offering it to the HDMI Forum, it is not necessarily something the Forum is interested in including. There is less benefit for video than there is for games, and people are underestimating the effort involved. Especially when it comes to audio sync, which HDMI needs to deal with but DP does not.

Variable refresh technology right now also does not support truly arbitrary refresh rates, with the minimum usually being upwards of 40Hz and frames being repeated to maintain the screen refresh. Video content does not vary its framerate, so they could match it if they wished, but it's a massive pain to implement something like this.
 

Paragon

Member
Adaptive sync for HDMI is something that AMD made, and even if they are offering it to the HDMI Forum, it is not necessarily something the Forum is interested in including. There is less benefit for video than there is for games, and people are underestimating the effort involved.
It is currently implemented using vendor-specific extensions, which makes it "legal" within the HDMI standard.
There's nothing preventing it from being supported in 2017 TVs.

A number of monitors received Adaptive-Sync support via firmware updates - even really cheap "eBay specials" from companies you've probably never heard of.
If they can manage it, big companies like Sony/Samsung/LG shouldn't have any difficulty supporting it.

[Slides from AMD's RTG Tech Summit listing the display partners already on-board with FreeSync over HDMI]

Look at the companies who are already on-board: LG and Samsung.
Samsung have already released monitors supporting this. I'm not sure if LG have released any yet, though they do have regular Adaptive-Sync displays.

There's no reason they shouldn't support this on their TVs.

Especially when it comes to audio sync, which HDMI needs to deal with but DP does not.
DisplayPort is not a video-only interface.

Variable refresh technology right now also does not support truly arbitrary refresh rates, with the minimum usually being upwards of 40Hz and frames being repeated to maintain the screen refresh. Video content does not vary its framerate, so they could match it if they wished, but it's a massive pain to implement something like this.
Most displays start at 30Hz now rather than 40Hz, but either is fine.
So long as your maximum refresh rate is at least 2–2.5x your minimum, you can repeat frames when the framerate drops below your minimum refresh rate.
So if your display doesn't support 24Hz you repeat each frame and display at 48Hz instead.
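Here's a rough sketch of that frame-repetition logic (my own illustration, not any vendor's actual implementation), assuming a hypothetical 40-120Hz window:

```python
# Low-framerate compensation sketch: if the content rate falls below the panel's
# minimum variable-refresh rate, repeat each frame an integer number of times so
# the effective refresh rate lands back inside the supported window.
def repeated_refresh(content_fps, vrr_min=40, vrr_max=120):
    multiplier = 1
    while content_fps * multiplier < vrr_min:
        multiplier += 1
    refresh = content_fps * multiplier
    if refresh > vrr_max:
        raise ValueError("content rate can't be mapped into the VRR window")
    return refresh, multiplier

print(repeated_refresh(24))  # -> (48, 2): each 24fps frame is shown twice at 48Hz
print(repeated_refresh(55))  # -> (55, 1): already inside the window, no repetition
```

This is also why the maximum needs to be at least ~2x the minimum: otherwise a framerate just below the minimum has nowhere to go.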

OLEDs shouldn't have the minimum refresh rate limitations that LCDs do, so they could theoretically handle everything from 1–120Hz, which the Adaptive-Sync spec supports.
The problem with LCDs is that the image fades the longer you try to hold it without updating the panel.

The lack of variable refresh rate support is the reason why there is no home version of the HFR Hobbit films.
There is no widespread support for 48Hz and it would probably require new players and displays - or at least firmware updates for both.
If we'd had variable refresh rate support from the start, you wouldn't have to think about it.
They'd just deliver a 48 FPS video and it would just work.

If "HFR" support just means that TVs now support 24/50/60/100/120Hz inputs instead of 24/50/60Hz, that will be really disappointing.
Now is the ideal time to support variable refresh rates - whether that's Adaptive-Sync-over-HDMI or some other standard. Support 40-120Hz on LCDs and 1-120Hz on OLEDs.
If a director wants to release a film at 72 or 96 FPS, or any other framerate they desire, they should be able to.
It's really short-sighted of them if HFR support just means two more fixed refresh rates being added, instead of moving to variable refresh rates.
And while video currently gets released at fixed frame rates, VFR encoding could help save bandwidth for streaming services, allowing them to deliver higher quality.
 

Theonik

Member
It is currently implemented using vendor-specific extensions, which makes it "legal" within the HDMI standard.
There's nothing preventing it from being supported in 2017 TVs.

A number of monitors received Adaptive-Sync support via firmware updates - even really cheap "eBay specials" from companies you've probably never heard of.
If they can manage it, big companies like Sony/Samsung/LG shouldn't have any difficulty supporting it.



Look at the companies who are already on-board: LG and Samsung.
Samsung have already released monitors supporting this. I'm not sure if LG have released any yet, though they do have regular Adaptive-Sync displays.

There's no reason they shouldn't support this on their TVs.
If it's not part of the HDMI standard there is absolutely no reason to support it, and doing so will only cause more headaches. It's 'legal' under the HDMI standard but will not work with standard devices, which defeats the purpose of the standard. Unless the HDMI Forum adopts it, it's useless.

Video sync is a significant and difficult-to-solve issue with variable refresh rates. And you shouldn't need to support adaptive sync to accept arbitrary input rates; the display's refresh rate doesn't have to change. And with regards to upgrading equipment, storage and the entire chain, that's the same issue you will have either way. It's not a simple change, and it requires a lot of design across the whole pipeline.

e: VFR would lead to tiny efficiency improvements in encoding, if any. Repeated frames are very efficiently encoded under any modern codec, to the point of being irrelevant. The overhead of storing the VFR information would likely outstrip the savings.
 

Paragon

Member
If it's not part of the HDMI standard there is absolutely no reason to support it, and doing so will only cause more headaches. It's 'legal' under the HDMI standard but will not work with standard devices, which defeats the purpose of the standard. Unless the HDMI Forum adopts it, it's useless.
And I'm saying that if they are going to introduce HFR into the HDMI standards without VRR support, it's a big mistake.
When LG and Samsung are already making displays which support this, there's no reason they couldn't include support in their TVs either, even if it's AMD's FreeSync-over-HDMI implementation as a vendor-specific extension.
This was AMD's approach with DisplayPort too. They got working Adaptive-Sync implementations and VESA then ratified it into the DisplayPort standard. The two are interchangeable.
Sony would have the most to gain by being the first to support it, since they could patch in support on the PS4/Pro and would have that entire market to themselves for a while.

Manufacturers are starting to pay more attention to gaming, and this is probably the most important change that any of them could make for it.
 

Theonik

Member
You are making the assumption that these manufacturers could make these changes with a software change, which is a pretty big unknown at the moment. At the very least, there are a lot of devices users could have in their display chain that won't be updated even if it is possible.

TVs and monitors use different display implementations too and have very different processing needs. We'll see how this is tackled.
 

Paragon

Member
You are making the assumption that these manufacturers could make these changes with a software change, which is a pretty big unknown at the moment.
I'm saying that it's been done before with a number of displays now.
Whether it can be added to existing displays via firmware - and I'm pretty confident that it could if they wanted to - is less important than adding support for it in their new displays, especially as they add HFR support.

At the very least, there are a lot of devices users could have in their display chain that won't be updated even if it is possible.
The same could be said about HFR support.
Even if the display supports it, the source has to as well.
The important thing is that the display has to support it before sources can.
 

AddiF

Member
Hi guys

Need help here as I'm in a bit of a dilemma.

LG 65" OLED E6V is on an almost $900 discount at one place here and I really, really want it although I've read about and seen on youtube, plenty of mentions and cases of image retention although not permanent (right?) and some input lag in HDR (firmware update solved that?).

My dilemma is... should I jump into the E6 pool or will I regret it for not waiting for 2017 OLEDs, the E7? What's on the horizon? Will it make me regret the purchase or is it likely that the upgrade is minimal? I know CES is just around the corner but when do the new LGs get released?

Jump or wait? I'm nervous as hell so I hope you can help me out here
 

NYR

Member
Hi guys

Need help here as I'm in a bit of a dilemma.

LG 65" OLED E6V is on an almost $900 discount at one place here and I really, really want it although I've read about and seen on youtube, plenty of mentions and cases of image retention although not permanent (right?) and some input lag in HDR (firmware update solved that?).

My dilemma is... should I jump into the E6 pool or will I regret it for not waiting for 2017 OLEDs, the E7? What's on the horizon? Will it make me regret the purchase or is it likely that the upgrade is minimal? I know CES is just around the corner but when do the new LGs get released?

Jump or wait? I'm nervous as hell so I hope you can help me out here
E6 owner here.

There are no image retention issues. You literally have to leave a static image on screen for HOURS in order to even be at risk. It should not influence your decision. There are also ways to clear said image retention, so I would not be concerned.

In regards to input lag, firmware 04.30.19 added an HDR Game Mode which improves the input lag. I'm a hardcore gamer, have everything, and I have no problems with the current input lag values. Yes, there are better TVs with better input lag numbers, but nothing that comes close picture-wise.

I personally would wait to see what the E7 has to offer; it is only a few weeks away. The biggest thing to watch for is brightness levels - the nit value is rumored to be 1000, which would be a big improvement over the E6. The biggest issue with OLEDs is brightness levels.

There is no way the E6 will be selling for retail by then, especially with the new model announced; you should be able to find it somewhere for that price or negotiate it.
 

mrklaw

MrArseFace
I wish they'd bring out an updated ARC v2.0 to support higher quality audio from the TV to a receiver. With smart TVs often being the easiest way to watch e.g. 4K Netflix, audio still being limited to DD 5.1 needs addressing.
 

AddiF

Member
E6 owner here.

There are no image retention issues. You literally have to leave a static image on screen for HOURS in order to even be at risk. It should not influence your decision. There are also ways to clear said image retention, so I would not be concerned.

In regards to input lag, firmware 04.30.19 added an HDR Game Mode which improves the input lag. I'm a hardcore gamer, have everything, and I have no problems with the current input lag values. Yes, there are better TVs with better input lag numbers, but nothing that comes close picture-wise.

I personally would wait to see what the E7 has to offer; it is only a few weeks away. The biggest thing to watch for is brightness levels - the nit value is rumored to be 1000, which would be a big improvement.

There is no way the E6 will be selling for retail by then, especially with the new model out; you should be able to find it somewhere for that price.

Thank you for your reply. Good to hear about IR and HDR lag. I'm leaning toward waiting to see what E7 is and does. Do you know when LG launches them?
 

NYR

Member
Thank you for your reply. Good to hear about IR and HDR lag. I'm leaning toward waiting to see what E7 is and does. Do you know when LG launches them?
Announced at CES in early January; they should be out in April at the latest in the US. LG lags in Canada, so you have to wait until June if you want one there.
 

AddiF

Member
Announced at CES in early January; they should be out in April at the latest in the US. LG lags in Canada, so you have to wait until June if you want one there.

Iceland here. So I'm guessing late 2017 here ha! ... ugh...
 

jstevenson

Sailor Stevenson
yeah no image retention problems.

I just left my E6 on a pause menu with text and stuff for a couple hours.

nothing visible once I resumed Dishonored.

HDR input lag is fixed now too.

Of course, if you want to wait, the discounts will probably continue and get steeper when the 2017s get announced in a week or two, so don't feel like you'll never see that discount again.
 

Kieli

Member
Was out and about on Boxing day and saw some HDR TVs. Have to say, I'm not that impressed.

Maybe it was the content they decided to show or the ambient lighting of the environment, but the image didn't look all that much more colourful than any of the other TVs I saw. Well, except that the darks were darker.

If big box stores with access to all the equipment and expertise to provide the best possible use-case scenario underwhelmed, I'm not sure daily usage with modestly mastered content is going to fare any better.
 

The Beard

Member
Was out and about on Boxing day and saw some HDR TVs. Have to say, I'm not that impressed.

Maybe it was the content they decided to show or the ambient lighting of the environment, but the image didn't look all that much more colourful than any of the other TVs I saw. Well, except that the darks were darker.

If big box stores with access to all the equipment and expertise to provide the best possible use-case scenario underwhelmed, I'm not sure daily usage with modestly mastered content is going to fare any better.



Lol

Big box stores in general are fucking terrible at setting up displays. Except Best Buy with a Magnolia center; they can do a decent job when they have TVs playing a Blu-ray in a darkened room.

TVs on the showroom floor in torch-mode are not good representations of PQ.
 
Hey guys, I was wondering if anyone has any recommendations for a gaming monitor. Specifically I am looking for a monitor with a minimum amount of input lag (lowest possible really) and maximum motion clarity. I mainly play Street Fighter 5, Overwatch and Team Fortress 2. Colors and viewing angles aren't important. I am currently using a BenQ GL2450-B 60Hz and I am very pleased by the input lag but not the motion clarity. There is significant ghosting/trailing (pixel response time?); it gets pretty blurry when swiping around in Overwatch. So I'd like to upgrade to one of the clearest ULMB 144Hz+ monitors while keeping input lag to a minimum. Also G-Sync is a plus but not needed, as it adds input lag from what I read. Any suggestions besides "get a CRT" :p ?
 

J-Rzez

Member
Samsung gonna get mopped up hard in 2017.

- Their QLED development plans are a mess and they have no idea when the new display tech will be ready.

Sony is planning to release OLED displays (using 2017 LG panels) in 2017.

LG's panels for 2017 are gonna be stunning with better HDR support & maybe even HFR (I've seen rumours of this for months and months now, so we'll find out soon)

Panasonic will have another OLED display ready for Europe, Australia and so on too. (I doubt the U.S. will get it)

Samsung will continue to drudge on with their irrelevant, useless, low-to-mid range TV schemes. Like their 5200, 5201, 6200. Then with their 6270, 6290, 6300, and "why" 7-series. They will continue to sell based on name to some, and aesthetics to the "I want that curve" people who care more about how a TV looks than its PQ. They will also make some "good" sets at good prices too, like the 8000.

Samsung will, however, lose the premium space. I'd expect OLED to start eating more and more of that market as their prices continue to drop. And remember, in time OLED manufacturing is expected to be cheaper than LED-LCD tech. LG just needs to work on some key areas I think, like motion. While better than Samsung's, LG's motion isn't on par with Sony's.

If someone is going to win next year it's going to be Sony, if they have those OLEDs at the expected MSRPs of $2000 for the 55" and $3000 for the 65". They will be at that price for a couple of weeks, then "sale" prices will hit and we may see the 55" drop to the $1200-1500 mark, and they will go even faster.

I too am in the watch-closely camp for their OLEDs. Sony does have some sort of knack for judder-free motion and non-intrusive motion tech. And I expect better PQ controls from Sony as well. The judder in my 65EF9500 is one of the things I'm not happy with, and lower input lag will be nice as well. As long as it lives up to my tempered expectations, I will retire my LG to another room, and a Sony will take the main spot.

1000 nits from an OLED with 99% CS, huh? That'll be interesting.
 

NYR

Member
Iceland here. So I'm guessing late 2017 here ha! ... ugh...
Woah, haha, yeah, might be wise to lean towards that E6 then. But the U.K. is a known market for TVs; they get decent supply, so you could always have one shipped from there for a reasonable price.

I would actually go to the retailer and ask them when they got the E6 in stock; that will give you a great hint at when you can expect to see an E7 in stock.

According to a quick Google search, LG held an OLED event in Iceland in June 2016, so that would be about the time Canada got them as well:

http://www.lg.com/sg/press-release/lg-oled-tv-brings-northern-lights-to-iceland-this-summer
 

holygeesus

Banned
Thank you for your reply. Good to hear about IR and HDR lag. I'm leaning toward waiting to see what E7 is and does. Do you know when LG launches them?

Unless the B6 is different to the E6, image retention *is* an issue with these sets, and you only need to leave static images on for seconds to see it. For example, play something like The Witness in HDR mode, and in the time it takes you to solve a puzzle, resuming the game will leave the box visible on screen for a few seconds.

It is *not* permanent, but you don't have to leave bright objects on screen for that long, for them to leave their (temporary) mark.

If I were you, I would wait a few months, as the E6 is a hell of an outlay for a set that will be bettered within a few months. I couldn't be happier with my B6 though, and it will need to be a significant leap for me to upgrade to a 2017 model.
 

Caayn

Member
Purely anecdotal: owner of a 2015 OLED here (LG 55EG920V to be precise, 2015's equivalent of the B6). I've got zero issues with burn-in on my set, whereas my Panasonic VT60 had trouble keeping the image IR/burn-in free under the same usage pattern. IR during gaming (with and without static HUDs) and movies is also a non-issue; I can't comment on TV usage as I haven't watched cable TV in years. The only time it shows IR (for a few seconds) during my usage is on the XB1's game library in the system's dashboard.
Samsung gonna get mopped up hard in 2017.

- Their QLED development plans are a mess and they have no idea when the new display tech will be ready.

Sony is planning to release OLED displays (using 2017 LG panels) in 2017.

LG's panels for 2017 are gonna be stunning with better HDR support & maybe even HFR (I've seen rumours of this for months and months now, so we'll find out soon)

Panasonic will have another OLED display ready for Europe, Australia and so on too. (I doubt the U.S. will get it)
Even if Samsung fails to produce new high-end and premium sets next year, they'll continue to sell based on brand recognition alone. The number of people I see around me owning a product solely because it says Samsung is staggering. They've done a great job building their name.

I'm also wondering if Philips will show their OLED + Ambilight prototype this CES in consumer form.
 

riflen

Member
Hey guys, I was wondering if anyone has any recommendations for a gaming monitor. Specifically I am looking for a monitor with a minimum amount of input lag (lowest possible really) and maximum motion clarity. I mainly play Street Fighter 5, Overwatch and Team Fortress 2. Colors and viewing angles aren't important. I am currently using a BenQ GL2450-B 60Hz and I am very pleased by the input lag but not the motion clarity. There is significant ghosting/trailing (pixel response time?); it gets pretty blurry when swiping around in Overwatch. So I'd like to upgrade to one of the clearest ULMB 144Hz+ monitors while keeping input lag to a minimum. Also G-Sync is a plus but not needed, as it adds input lag from what I read. Any suggestions besides "get a CRT" :p ?

G-Sync mode won't really add any latency that you'll notice. Latency is closer to Vsync off than Vsync on. However, if you're crazy about the best motion resolution possible with an LCD, the most suitable mode is the strobing backlight mode.
G-Sync monitors (mostly) also include the ULMB mode. This provides the best motion resolution, but you will have to use a standard Vsync method to avoid tearing and your PC must maintain frame rates of 85, 100 or 120 fps. These are the Hz modes that ULMB mode operates at for most monitors I've looked at.
Unfortunately ULMB mode is not much use at all for games that are limited to a maximum of 60 fps. Using 120 Hz ULMB mode at 60 fps will look rubbish.

I don't have a model recommendation I'm afraid, as I've not looked at 1920x1080 monitors for some time. I would say though that you should go for a display with an 8-bit TN panel (avoid 6-bit with dithering). Even though we're getting high hertz IPS and even VA panels, they're still behind TN if pixel response is one of the most important things.
In early 2017 there will be a couple of 240 Hz TN monitors from Asus and BenQ. No reviews that I know of yet but the Asus PG258Q will be a G-Sync monitor. The good news for you is that coming from a 60 Hz display, you are in for a real treat when you see strobed 120 Hz / 120 fps.
 

holygeesus

Banned
If you're aiming that post at me: I know ;)

Sorry, I think I misread your post the first time round. I get the same temporary IR on mine as I did on my previous plasma, in that during 99% of normal usage I don't see it, but on occasion it is noticeable (see my example with The Witness earlier).
 
G-Sync mode won't really add any latency that you'll notice. Latency is closer to Vsync off than Vsync on. However, if you're crazy about the best motion resolution possible with an LCD, the most suitable mode is the strobing backlight mode.
G-Sync monitors (mostly) also include the ULMB mode. This provides the best motion resolution, but you will have to use a standard Vsync method to avoid tearing and your PC must maintain frame rates of 85, 100 or 120 fps. These are the Hz modes that ULMB mode operates at for most monitors I've looked at.
Unfortunately ULMB mode is not much use at all for games that are limited to a maximum of 60 fps. Using 120 Hz ULMB mode at 60 fps will look rubbish.

I don't have a model recommendation I'm afraid, as I've not looked at 1920x1080 monitors for some time. I would say though that you should go for a display with an 8-bit TN panel (avoid 6-bit with dithering). Even though we're getting high hertz IPS and even VA panels, they're still behind TN if pixel response is one of the most important things.
In early 2017 there will be a couple of 240 Hz TN monitors from Asus and BenQ. No reviews that I know of yet but the Asus PG258Q will be a G-Sync monitor. The good news for you is that coming from a 60 Hz display, you are in for a real treat when you see strobed 120 Hz / 120 fps.

Ty, is there a site somewhere where I can find/compare ULMB monitors in terms of motion clarity? Guess I'd cross-reference with the input lag monitor database site.
 

Paragon

Member
Ty, is there a site somewhere where I can find/compare ULMB monitors in terms of motion clarity? Guess I'd cross-reference with the input lag monitor database site.
TFT Central usually has images in their reviews showing what ULMB is like.
I think the BenQ 2720Z is still the only monitor that has a single-strobe mode which can work at all refresh rates, though; ULMB is locked to 85/100/120Hz.
The 2720Z is only a TN panel and doesn't have G-Sync, however.
 

NYR

Member
Rumors are 1000 nits on 2017 OLEDs? Laaaaaaawwwdddddy
I think it's more of what everyone wants and expects. Kind of like wanting an OLED iPhone - people have been asking for that for years, so they might finally get it. With OLED TVs, the issue has always been with image brightness across the entire screen, so 1000 nits would address that.

To be honest, I don't even notice the brightness "issue" on my E6 anymore. It really stung coming from a Samsung LED at first, but now it is fine; you just grow accustomed to it.
 

Paragon

Member
I think it's more of what everyone wants and expects. Kind of like wanting an OLED iPhone - people have been asking for that for years, so they might finally get it. With OLED TVs, the issue has always been with image brightness across the entire screen, so 1000 nits would address that.

To be honest, I don't even notice the brightness "issue" on my E6 anymore. It really stung coming from a Samsung LED at first, but now it is fine; you just grow accustomed to it.
LG's OLED has gone from 500 nits to 800 nits peak, with full-screen brightness remaining at ~150 nits. They made little-to-no progress in that area at all.
If they reach 1000 nits as a result of improved efficiency, I'm expecting maybe 200 nits full-screen brightness at best.
As a reminder, HDR is expected to support 400 nits full-screen brightness, so the OLEDs are a whole stop below that.
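To put numbers on "a whole stop" (a stop being a doubling of luminance - my framing, not anything from the HDR specs):

```python
from math import log2
# Stops relative to the ~400-nit full-screen level mentioned above.
for label, nits in [("OLED full-screen today", 150),
                    ("hoped-for full-screen", 200),
                    ("HDR full-screen target", 400)]:
    print(f"{label}: {nits} nits = {log2(nits / 400):+.2f} stops")
# 150 -> -1.42 stops, 200 -> -1.00, 400 -> +0.00
```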

For a properly calibrated SDR image this is not a problem at all, since the SDR spec is 100 nits.
However most people seem to run their TVs far brighter than that - especially if they're trying to watch during the day.
My daytime mode has the backlight maxed out for ~400 nits full-screen brightness (it's an older LCD) and that still looks dim on a bright day.

Hitting 1000 nits peak is great, but that doesn't fix the 'brightness problem' that OLEDs have.
They're still going to dim noticeably when large areas of the screen are supposed to be bright.
 

NYR

Member
LG's OLED has gone from 500 nits to 800 nits peak, with full-screen brightness remaining at ~150 nits. They made little-to-no progress in that area at all.
If they reach 1000 nits as a result of improved efficiency, I'm expecting maybe 200 nits full-screen brightness at best.
As a reminder, HDR is expected to support 400 nits full-screen brightness, so the OLEDs are a whole stop below that.

For a properly calibrated SDR image this is not a problem at all, since the SDR spec is 100 nits.
However most people seem to run their TVs far brighter than that - especially if they're trying to watch during the day.
My daytime mode has the backlight maxed out for ~400 nits full-screen brightness (it's an older LCD) and that still looks dim on a bright day.

Hitting 1000 nits peak is great, but that doesn't fix the 'brightness problem' that OLEDs have.
They're still going to dim noticeably when large areas of the screen are supposed to be bright.
I agree with everything you said except the last paragraph. I think OLEDs are already fine, so any increase will only be beneficial. I don't expect them to be able to keep up with LCDs in terms of brightness, but it is all about trade-offs.
 

sector4

Member
The Criterion Collection Blu-ray is already amazing-looking at 1080p. I can't wait to see the 4K version. I'm guessing it will eventually come to theaters in the US for some limited release.
Man, that sounds awesome. The last time I watched Seven Samurai was on DVD, on an LG CRT, haha; that's how long ago it was. I can't wait to see what it'll be like re-scanned and restored to 4K.

One thing I'd really like to see is a black and white film shot to really take advantage of HDR sets' crazy dynamic range. Sort of like Sin City. Something that stylised could look really good.

I've watched a few more 4K Blu-rays recently, and continue to be impressed by the Z9D. The Shallows is a dumb, dumb movie, but has some gorgeous shots in it. Really good showcase for Atmos also. The Amazing Spider-Man 2 is really good in 4K also. Tomorrow I'm going to watch either Sicario or TMNT: Out of the Shadows. I hit up the Boxing Day sales and also grabbed Magnificent Seven and Everest on 4K BD; I haven't seen any of those films, so it should be fun!
 

Theonik

Member
Black and white films would be quite interesting in HDR, I think. They need to be heavily dithered to fit into 8-bit colour, due to only having 256 shades of gray in that encoding to play around with. They should look much closer to what was intended on UHD BD.
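Rough numbers to illustrate (my own, using the standard "limited"/video-range code values rather than full range):

```python
# Usable gray levels at each bit depth for video-range encoding.
# 8-bit video uses luma codes 16-235; 10-bit UHD BD uses 64-940.
for bits, lo, hi in [(8, 16, 235), (10, 64, 940)]:
    print(f"{bits}-bit: {hi - lo + 1} gray levels (of {2**bits} total codes)")
# 8-bit: 220 gray levels (of 256 total codes)
# 10-bit: 877 gray levels (of 1024 total codes)
```

Roughly four times as many steps between black and white, so smooth gradients in a black and white film should band far less.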
 

HStallion

Now what's the next step in your master plan?
Man, that sounds awesome. The last time I watched Seven Samurai was on DVD, on an LG CRT, haha; that's how long ago it was. I can't wait to see what it'll be like re-scanned and restored to 4K.

One thing I'd really like to see is a black and white film shot to really take advantage of HDR sets' crazy dynamic range. Sort of like Sin City. Something that stylised could look really good.

I've watched a few more 4K Blu-rays recently, and continue to be impressed by the Z9D. The Shallows is a dumb, dumb movie, but has some gorgeous shots in it. Really good showcase for Atmos also. The Amazing Spider-Man 2 is really good in 4K also. Tomorrow I'm going to watch either Sicario or TMNT: Out of the Shadows. I hit up the Boxing Day sales and also grabbed Magnificent Seven and Everest on 4K BD; I haven't seen any of those films, so it should be fun!

I'd say pick up the Criterion Collection version of Seven Samurai. The restoration is amazing. A lot of shots have been cleaned up so well that they look like they were filmed yesterday.
 
Folks, is short-term burn-in from games bad? I got a new Samsung TV (the UN40KU6300, specifically) and after watching a Twitch stream I noticed that the speedrunner's splits were still somewhat visible even on the PS4 home menu. Is this normal, and can watching Twitch streams every night lead to long-term damage?
 

dallow_bg

nods at old men
Folks, is short-term burn-in from games bad? I got a new Samsung TV (the UN40KU6300, specifically) and after watching a Twitch stream I noticed that the speedrunner's splits were still somewhat visible even on the PS4 home menu. Is this normal, and can watching Twitch streams every night lead to long-term damage?

Did they go away?
I always thought burn-in on LCD screens was permanent. People forget it can happen on them as well.

But you'd have to leave it on for a LONG time.
 

BumRush

Member
I think it's more of what everyone wants and expects. Kind of like wanting an OLED iPhone - people have been asking for that for years, so they might finally get it. With OLED TVs, the issue has always been with image brightness across the entire screen, so 1000 nits would address that.

To be honest, I don't even notice the brightness "issue" on my E6 anymore. It really stung coming from a Samsung LED at first, but now it is fine; you just grow accustomed to it.

The brightness would have more to do with improving the depth and realism of HDR (even further than it is) than it would with a consistent 1,000 nits, right?
 