
Television Displays and Technology Thread: This is a fantasy based on OLED

The Beard

Member
Come on man, you know damn well that leaving the same static image for 16+ hours a day at 100% brightness and NEVER turning your TV off by using the power button (which runs a compensation cycle) isn't normal TV behavior for 99.99999999% of people. If it is for you, great, don't get an OLED. But please don't act like not doing the above is having to "baby" your TV.

The Best Buy units weren't showing static images for 16+ hours though. People were seeing burn-in from the "LG 4K" logos that would pop up at various times of the demo loop.

It doesn't have to be at 100% brightness to cause issues either (it definitely doesn't help). Some emissive sets get annoying IR even when brightness/contrast is at 60-70%.
 

sector4

Member
I'm struggling to switch from my 100" PJ, even with 4k.

The day draws nigh.
If it helps you at all, I came from a 1080p 109" projector to the 4K 65" LCD, and I'm not missing the size as much as I thought I would. Granted, my viewing distance is quite a bit shorter now, and the 4K + HDR more than makes up for any size difference. If you're going to be sitting at a similar distance though, 100 -> 65 or even 75 is going to be pretty noticeable.
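
If you want to put rough numbers on the size-vs-distance question, here's a quick Python sketch of the standard visual-acuity math; the 60 pixels-per-degree threshold and the example size/distance pairs are just assumptions for illustration:

import math

# How many pixels of a 4K panel land in one degree of your vision at a
# given size/distance? ~60 px/degree corresponds to 20/20 acuity; above
# that, you can't resolve the individual pixels anyway.
def pixels_per_degree(diag_in, distance_in, horiz_px=3840):
    width_in = diag_in * 16 / math.sqrt(16**2 + 9**2)  # 16:9 panel width
    px_per_inch = horiz_px / width_in
    inches_per_degree = 2 * distance_in * math.tan(math.radians(0.5))
    return px_per_inch * inches_per_degree

for diag_in, dist_in in ((65, 72), (100, 120)):
    ppd = pixels_per_degree(diag_in, dist_in)
    print(f'{diag_in}" at {dist_in / 12:.0f} ft: ~{ppd:.0f} px/degree')

Both examples come out well above 60, i.e. at those seats you aren't resolving all the 4K detail yet, which is why a smaller set at a shorter distance can look just as sharp as a bigger one further back.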

Had time to watch the show today and took a picture for comparison. Never mind the potato camera quality, but it looks like you are actually getting crushed blacks.
Yeah, and your shot is clipping the highlights, but I know it's a problem with the camera, not the display. Even my 5D isn't capable of capturing the full dynamic range of the Z9D without post processing. There was a reason I only showed photos directly out of the camera without editing: otherwise it becomes a shit show, and I probably would have been accused of editing the pics to show the TV in a better light :p there'd be no transparency. But for argument's sake, here is the exact same photo with the values adjusted to show the detail is absolutely there.

E1bDPv7.jpg


If you compare yours to that, I'd say yours is missing detail in the darker sections of her hair, but it's pointless arguing over compressed JPEGs from cameras that aren't equipped to capture these displays properly :p
 

BumRush

Member
The Best Buy units weren't showing static images for 16+ hours though. People were seeing burn-in from the "LG 4K" logos that would pop up at various times of the demo loop.

It doesn't have to be at 100% brightness to cause issues either (it definitely doesn't help). Some emissive sets get annoying IR even when brightness/contrast is at 60-70%.

I've never seen an OLED display unit at a store with IR. All I'm trying to say is that you don't have to "baby" your set, as a lot of posters in here have already mentioned.
 
Is there anyone else besides me who is interested in OLED partially for the energy savings?

I got rid of a 400-watt 60-inch LG plasma from 2009 two months ago. My electricity bill since then went down by 40-50 bucks, simply by replacing my TV with a 90-watt monitor.
That's 40-50 bucks saved every month. This realization has made me question the power draw of all of my electronics, and it has made me really interested in OLED. I don't like the idea that my energy bill gets fucked up because I buy power-hungry tech when it doesn't have to be that way.

My desktop has a 1000-watt power supply, but the draw seems to not even reach 500 watts at load. It's fucking stupid.



So I've been interested in a 75-inch TV and I looked at the KS8000, and it had a power draw of 350 watts during operation. That's a lot! Is it HDR that's driving it up? I then checked the wattage on some OLED panels and it was two to three times lower. It makes me think that a TV isn't just what you pay up front; it's also the price you pay on your electricity bill.

I thought this was just chump change. I'm floored at how much electricity these things eat!
 

Kyoufu

Member
One of the best things about upgrading from the old 2009 plasma TV was the new TV not generating so much heat that the room becomes unbearable during the summer. I'm so glad I'm done with plasma forever.
 

The Beard

Member
Is there anyone else besides me who is interested in OLED partially for the energy savings?

I got rid of a 400-watt 60-inch LG plasma from 2009 two months ago. My electricity bill since then went down by 40-50 bucks, simply by replacing my TV with a 90-watt monitor.
That's 40-50 bucks saved every month. This realization has made me question the power draw of all of my electronics, and it has made me really interested in OLED. I don't like the idea that my energy bill gets fucked up because I buy power-hungry tech when it doesn't have to be that way.

My desktop has a 1000-watt power supply, but the draw seems to not even reach 500 watts at load. It's fucking stupid.



So I've been interested in a 75-inch TV and I looked at the KS8000, and it had a power draw of 350 watts during operation. That's a lot! Is it HDR that's driving it up? I then checked the wattage on some OLED panels and it was two to three times lower. It makes me think that a TV isn't just what you pay up front; it's also the price you pay on your electricity bill.

I thought this was just chump change. I'm floored at how much electricity these things eat!

How many hours a day would you run your plasma? $40-$50 a month is insane.

I had a feeling that most of my electricity bill was from running my plasma. 6-7 hours a day during the week and sometimes 12+ hours on some of my off days where I just turn it on in the morning and leave it on while I do stuff around the house.

In general are OLEDs more energy efficient than LCDs?
 
I've never seen an OLED display unit at a store with IR. All I'm trying to say is that you don't have to "baby" your set, as a lot of posters in here have already mentioned.
The FUD will continue regardless, so don't bother arguing. We've been through this in a variety of threads, and some folks have no interest in listening. Yes, burn-in is possible - if you put the television through the most extreme of torture tests, which isn't likely to be the case in normal use. No babying necessary. Yes, IR is possible, and very temporary at worst. Don't believe it? Then go buy an LED and enjoy. No harm and no foul.

The sad part is that arguing these same points ad nauseam takes away from the actual issues with OLED right now, which include lesser motion resolution if you're coming from a plasma (and yes, it's noticeable, at least to me), along with the definite panel lottery you're forced to play, with sets having both banding and vignetting issues to some degree - in the worst cases requiring a swap of the set due to the severity of one or both. To be fair, you're playing the same game with LED technology; just substitute light bleeding, clouding, etc.

My advice to new OLED owners - don't get fixated on calibration slides long term. My B6 had minor vignetting and banding out of the box. Two manual compensation cycles improved it substantially. A few hundred hours later (and a few dozen auto compensation cycles), nearly all the vignetting has cleared and the banding is reduced to the point where it's completely unnoticeable in regular content viewing. I couldn't care less whether it appears on a calibration slide if I can't see it during actual content viewing. YMMV.
 

Trojita

Rapid Response Threadmaker
Samsung said my parents' KS8000 would ship out the week of the 12th. They never got a ship e-mail. :S

I really hope they've fixed whatever is causing the abnormal amount of QC problems with this set.
 
How many hours a day would you run your plasma? $40-$50 a month is insane.

I had a feeling that most of my electricity bill was from running my plasma. 6-7 hours a day during the week and sometimes 12+ hours on some of my off days where I just turn it on in the morning and leave it on while I do stuff around the house.

In general are OLEDs more energy efficient than LCDs?

I would have my plasma on for 10-15 hours a day.

Yes. I was just checking it out, and it surprises me that more people aren't talking about it. OLED seems to be very energy efficient because of the blacks: the blacks are literally the display not lighting up. Think about how genius that is. Other display types light up the entire screen, including the blacks. OLED shuts off those areas, so when something is black on screen you're looking at dead black nothingness.
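
To make that concrete, here's a toy Python model of content-dependent draw; every wattage in it is a made-up illustration number, not a measurement:

# Toy model: OLED power scales with how much of the screen is lit, while
# a conventional LCD backlight draws the same regardless of content.
def oled_power_w(apl, base_w=20, full_white_w=150):
    # apl = average picture level, 0.0 (all black) to 1.0 (all white)
    return base_w + apl * (full_white_w - base_w)

def lcd_power_w(apl, backlight_w=120):
    return backlight_w  # backlight stays on no matter the content

for apl in (0.0, 0.2, 0.5, 1.0):
    print(f"APL {apl:.0%}: OLED ~{oled_power_w(apl):.0f} W, LCD ~{lcd_power_w(apl):.0f} W")

So dark movie content is where OLED wins on power; a full white screen can actually draw more than the LCD.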

Imagine if they manage to create Vantablack as the backdrop in future OLEDs? That would be insane. That's the black that sucks in light. The contrast would be fucking insane if they could make it work in a way that wouldn't fuck up the surrounding light.


Samsung says in 2017 they will offer something new. Some new alternative to Quantum Dot?
 

holygeesus

Banned
So now we're back to having to baby your TV and not being able to do certain things with it, like with plasma. Yeah, I'm not going to do that. If I'm using my media PC, looking at the Windows desktop and browsing the Internet, I'm going to be looking at the Windows taskbar on my TV for potentially hundreds and hundreds of hours over the course of a year. Is this considered "reasonable" usage? Who defines what "reasonable" usage is?

You really don't understand how these sets work do you? That or you are just trying too hard to act ignorant. I have no idea what your motive is, but you aren't doing yourself any favours.
 
Samsung said my parents' KS8000 would ship out the week of the 12th. They never got a ship e-mail. :S

I really hope they've fixed whatever is causing the abnormal amount of QC problems with this set.

Samsung would never do anything they say, even if their entire company was wiped from the face of the earth for not doing so!
Worst I have EVER dealt with.
 
Samsung says in 2017 they will offer something new. Some new alternative to Quantum Dot?

It's apparently RGB Quantum Dots. They are promising full Rec.2020 coverage. It should be pretty impressive in terms of gamut, though they are still not able or not willing to match Sony in backlight technology. Sony's 600-zone "Master Backlight Drive" is probably not cheap to implement.

I really hope they've fixed whatever is causing the abnormal amount of QC problems with this set.

Samsung TVs are well-known for their poor reliability. It's one thing that has kept me from buying an expensive high-end Samsung set for years now.
 

Based_legend24

Neo Member
All of you speak of the ZD9 and OLED. All I have is a 930D haha ): Got mine in September, couldn't help it. I just came to the conclusion that no huge jump in tech will occur in 2017 that would make me regret my purchase. I don't have the bank to drop 2K+ on an OLED or a Sony OLED next year. If push comes to shove I'll sell this for like 1K if possible. I hope the Sony OLEDs are amazing. They should end up being better due to Sony having the best processing out.
 
Damn it. So I want to get the 49" ks8000 this week (returning my 43" x800d for it). But I just saw that the 55" is now down to 1000 and the 49 inch is back to 1100. I thought making the jump to 49 was going to be a bit ridiculous but I can't see myself going 55. Help :(
 
Damn it. So I want to get the 49" ks8000 this week (returning my 43" x800d for it). But I just saw that the 55" is now down to 1000 and the 49 inch is back to 1100. I thought making the jump to 49 was going to be a bit ridiculous but I can't see myself going 55. Help :(

I would just get the 65. I mean, at some point you're going to upgrade to a bigger house so you'll at least have the TV for it already.
 

Paragon

Member
Is there anyone else besides me who is interested in OLED partially for the energy savings?
If you have a plasma display, that makes sense. Those things were power hogs.
However, OLEDs, being emissive displays, are not especially efficient themselves either.
That's part of the reason they have such an aggressive brightness limiter.
The most efficient type of display is an edge-lit LCD, if your concern is power consumption over anything else.

That said, it is almost never worthwhile to replace working hardware with something else for the sake of efficiency or "the environment".
It takes far more resources to manufacture and ship a new display than it does to stick with what you have got and keep using it until it dies.
A more efficient product rarely pays for itself over the useful lifetime of the product.

People should be replacing dead hardware with more efficient options, not buying new hardware to replace perfectly functional but less efficient hardware.
In most cases anyway, I'm sure there will always be exceptions.

I got rid of a 400-watt 60-inch LG plasma from 2009 two months ago. My electricity bill since then went down by 40-50 bucks, simply by replacing my TV with a 90-watt monitor.
That's 40-50 bucks saved every month. This realization has made me question the power draw of all of my electronics, and it has made me really interested in OLED. I don't like the idea that my energy bill gets fucked up because I buy power-hungry tech when it doesn't have to be that way.
Is that actually what your bill says, or an estimate based on the power supply ratings? Because saving ~$45/month would be at the extreme side of things.
Very unusual to be saving that much when replacing like-for-like.

Still, if your plan is to replace that plasma with a $4000 65" OLED, it would take ~7.5 years to pay for itself, which is probably longer than most people intend to keep their displays.
And that doesn't factor in its own running costs either, which means it's going to take even longer to pay for itself.

Of course there are other reasons to want to upgrade from a plasma to a new TV, but like I said, it rarely ever saves you money to switch to something more efficient when you work out the full cost vs how long you expect the useful lifespan of the product will be.
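
If anyone wants to check the arithmetic, here's a quick back-of-the-envelope Python sketch; the electricity rate and viewing hours are assumptions, so substitute the numbers from your own bill:

# Back-of-the-envelope TV running costs. Rate and hours are assumptions.
RATE_PER_KWH = 0.13   # assumed electricity rate, $/kWh
HOURS_PER_DAY = 12    # heavy usage, per the posts above

def monthly_cost(watts):
    kwh_per_month = watts / 1000 * HOURS_PER_DAY * 30
    return kwh_per_month * RATE_PER_KWH

savings = monthly_cost(400) - monthly_cost(90)  # 400 W plasma vs 90 W monitor
print(f"Estimated savings: ${savings:.2f}/month")   # ~$14.50, not $40-50

# Payback on a $4000 OLED if you take the claimed $45/month at face value:
print(f"Payback: ~{4000 / (45 * 12):.1f} years")    # ~7.4 years

At a typical rate you'd need very long hours or expensive electricity to actually save $45 a month, which is why that figure looks extreme.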

My desktop has a 1000-watt power supply, but the draw seems to not even reach 500 watts at load. It's fucking stupid.
A power supply rating is its maximum load, not its power draw.
The system will only draw what it needs.
Power supplies are at their most efficient around 50% load, so if it's drawing 500W - which is very high for a gaming PC these days unless you're using SLI - then your system is probably at peak efficiency under load, which is ideal.
With a 1000W power supply you are probably not going to be in the most efficient range when idle, but again it is almost never worth the cost of replacing it for that.
At idle loads the difference in efficiency is only a few watts, not hundreds, and it would take decades for a power supply to pay for itself.
Considering that a power supply is something you should be replacing every 5-10 years anyway (always replace them once the warranty expires), that will never happen.
As long as the PSU is 80PLUS rated, it's not something you have to think about. Higher-wattage power supplies are usually among the most efficient.
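
To put rough numbers on the idle point - the efficiency figures below are assumptions in the ballpark of published 80PLUS curves:

# Why PSU efficiency barely matters at idle: wall draw = DC load / efficiency.
def wall_draw_w(dc_load_w, efficiency):
    return dc_load_w / efficiency

idle_load_w = 60  # assumed component draw at idle

low_eff = wall_draw_w(idle_load_w, 0.82)   # oversized PSU, below its sweet spot
high_eff = wall_draw_w(idle_load_w, 0.90)  # PSU sitting near peak efficiency
print(f"{low_eff:.0f} W vs {high_eff:.0f} W at the wall "
      f"- about {low_eff - high_eff:.0f} W apart")

A handful of watts at idle is why replacing a working PSU purely for efficiency takes decades to pay off.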


The Best Buy units weren't showing static images for 16+ hours though. People were seeing burn-in from the "LG 4K" logos that would pop up at various times of the demo loop.

It doesn't have to be at 100% brightness to cause issues either (it definitely doesn't help). Some emissive sets get annoying IR even when brightness/contrast is at 60-70%.
Burn-in/image retention is all about cumulative wear on emissive displays, not continuous use.
Continuous use will just cause it to happen sooner.

That's why I am still a little concerned about OLED.
You can run compensation cycles as a temporary measure but I don't expect that to be a permanent solution.
Eventually you will have to start lowering the brightness of the entire display to keep it looking uniform.

Since LCDs are not emissive displays, this is not a problem for them.
When you switch the display off - whether that's putting it on standby or killing the power at the wall - the pixels will return to their resting state.
With the exception of some displays with faulty overdrive systems, or situations where the image is kept on-screen 24/7/365 as in commercial applications, you cannot cause permanent damage to an LCD display.
Burn-in is technically impossible on an LCD; what happens in that situation is that the pixels get "stuck" in one state - which is why it can often be undone by displaying flashing images on-screen for a while.
It's not like an emissive display where the pixels are actually "worn down" due to extended use.

Those were the 2015 models that were thought or known to have that issue. The 2016 models had new technology to prevent that from happening.
Image retention is happening on the 2016 OLEDs with HDR games.
It's minor, as image retention goes, but it is happening.
Please don't tell people that it doesn't.

How do they know it was permanent burn in? My VT60 has some gnarly ass image retention but surprisingly even that eventually goes away.
It's true that actual permanent damage is uncommon outside of commercial usage.
Usually it will fade enough over time that it's no longer a concern.
However with plasma TVs the "over time" part could mean weeks or even months before it disappears, depending on your usage.
I know people that had game HUD image retention on their Panasonic plasmas for months before it disappeared. I had burn-in on a Pioneer 5080 that appeared within the first week of buying the TV; it was minor and faint, but it never went away.

Damn it. So I want to get the 49" ks8000 this week (returning my 43" x800d for it). But I just saw that the 55" is now down to 1000 and the 49 inch is back to 1100. I thought making the jump to 49 was going to be a bit ridiculous but I can't see myself going 55. Help :(
I've never known anyone to regret buying a bigger TV than they planned to.
I do know a lot of people who wish they had bought the next size up from what they have now.
 

holygeesus

Banned
Burn-in/image retention is all about cumulative wear on emissive displays, not continuous use.
Continuous use will just cause it to happen sooner.

That's why I am still a little concerned about OLED.
You can run compensation cycles as a temporary measure but I don't expect that to be a permanent solution.
Eventually you will have to start lowering the brightness of the entire display to keep it looking uniform.

Since LCDs are not emissive displays, this is not a problem for them.
When you switch the display off - whether that's putting it on standby or killing the power at the wall - the pixels will return to their resting state.
With the exception of some displays with faulty overdrive systems, or situations where the image is kept on-screen 24/7/365 as in commercial applications, you cannot cause permanent damage to an LCD display.
Burn-in is technically impossible on an LCD; what happens in that situation is that the pixels get "stuck" in one state - which is why it can often be undone by displaying flashing images on-screen for a while.
It's not like an emissive display where the pixels are actually "worn down" due to extended use.

Image retention is happening on the 2016 OLEDs with HDR games.
It's minor, as image retention goes, but it is happening.
Please don't tell people that it doesn't.

It's true that actual permanent damage is uncommon outside of commercial usage.
Usually it will fade enough over time that it's no longer a concern.
However with plasma TVs the "over time" part could mean weeks or even months before it disappears, depending on your usage.
I know people that had game HUD image retention on their Panasonic plasmas for months before it disappeared. I had burn-in on a Pioneer 5080 that appeared within the first week of buying the TV; it was minor and faint, but it never went away.

No television lasts forever. For every OLED with a fixed cumulative lifespan, you have an LCD whose backlight has similar degrading properties. It is the nature of every piece of technology you own.

The 2016 OLED range has a 100,000 hour lifespan. I've read reports of LCD backlights being rated at 60,000-70,000 hours, if other components don't burn out beforehand.
 

Theonik

Member
No television lasts forever. For every OLED with a fixed cumulative lifespan, you have an LCD whose backlight has similar degrading properties. It is the nature of every piece of technology you own.

The 2016 OLED range has a 100,000 hour lifespan. I've read reports of LCD backlights being rated at 60,000-70,000 hours, if other components don't burn out beforehand.
It's 100k hours to half brightness. You will notice degradation of performance much sooner than that, and image retention and burn-in can also appear before that time. The bigger concern is that there have not been many fundamental improvements to OLED technology, so all those strides in HDR are coming at the expense of panel reliability.
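
For perspective, the rating itself is enormous even before you argue about degradation showing up earlier; the arithmetic is trivial:

# What a 100,000-hour half-brightness rating works out to in practice.
RATED_HOURS = 100_000

for hours_per_day in (4, 8, 15):
    years = RATED_HOURS / (hours_per_day * 365)
    print(f"{hours_per_day} h/day -> ~{years:.0f} years to half brightness")

Even at 15 hours a day that's roughly 18 years to rated half brightness, though as said above, uneven wear and IR can be visible well before that point.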
 

III-V

Member
No television lasts forever. For every OLED with a fixed cumulative lifespan, you have an LCD whose backlight has similar degrading properties. It is the nature of every piece of technology you own.

The 2016 OLED range has a 100,000 hour lifespan. I've read reports of LCD backlights being rated at 60,000-70,000 hours, if other components don't burn out beforehand.

It's 100k hours to half brightness. You will notice degradation of performance much sooner than that, and image retention and burn-in can also appear before that time. The bigger concern is that there have not been many fundamental improvements to OLED technology, so all those strides in HDR are coming at the expense of panel reliability.

Also, for all technologies, these times to half brightness are likely rated at the standard brightness of SDR content. HDR use will run down the lifetime of these sets, no doubt. HDR on an LCD requires the backlight at full power, while OLED brightness is current-controlled, so higher brightness means more wear on the diodes and lowered efficiency.
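
That wear-vs-brightness relationship is usually described with an empirical acceleration law; here's a minimal Python sketch assuming the commonly cited inverse power law with an exponent around 1.5-2 (the concrete nits and hours below are made-up illustration values):

# Empirical OLED lifetime scaling: L1**n * t1 ≈ L2**n * t2, with the
# acceleration exponent n typically quoted in the 1.5-2 range.
def lifetime_at(new_nits, rated_nits, rated_hours, n=1.8):
    return rated_hours * (rated_nits / new_nits) ** n

rated_hours = 100_000           # assumed rating at an SDR-ish level
sdr_nits, hdr_nits = 150, 400   # assumed sustained luminance levels

hdr_hours = lifetime_at(hdr_nits, sdr_nits, rated_hours)
print(f"~{hdr_hours:,.0f} h at {hdr_nits} nits")  # ~17,000 h

At n=1.8 that's roughly a 6x reduction: driving the diodes harder for HDR runs down the lifetime much faster than linearly.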
 
New 65" B6 owner here...I'm sure this has been discussed before, but are there any settings adjustments I can make to minimize the artifacts I sometimes see around moving objects?

Coming from an 8G Kuro plasma, it's my one little gripe about this TV, which is otherwise absolutely incredible. Yesterday I was watching Narcos in 4K for the first time and I was astounded at the color and detail.

Also, is there any way to make the on-screen cursor disappear quicker other than just waiting for it to go away?
 

holygeesus

Banned
Also, for all technologies, these times to half brightness are likely rated at the standard brightness of SDR content. HDR use will run down the lifetime of these sets, no doubt. HDR on an LCD requires the backlight at full power, while OLED brightness is current-controlled, so higher brightness means more wear on the diodes and lowered efficiency.

The estimated lifespan, to half brightness, of the 2015 model, was 14 years with the brightness setting maxed out. The 2016 sets are even more efficient.

Despite what people here want to believe, for whatever reason, technology does indeed improve year on year.

http://phys.org/news/2016-03-lifetime-breakthrough-low-cost-efficient-oled.html

New 65" B6 owner here...I'm sure this has been discussed before, but are there any settings adjustments I can make to minimize the artifacts I sometimes see around moving objects?

Turn off all TruMotion settings.

Coming from an 8G Kuro plasma, it's my one little gripe about this TV, which is otherwise absolutely incredible. Yesterday I was watching Narcos in 4K for the first time and I was astounded at the color and detail.

Also, is there any way to make the on-screen cursor disappear quicker other than just waiting for it to go away?

Press down on the directional pad.
 

III-V

Member
The estimated lifespan, to half brightness, of the 2015 model, was 14 years with the brightness setting maxed out. The 2016 sets are even more efficient.

Despite what people here want to believe, for whatever reason, technology does indeed improve year on year.

http://phys.org/news/2016-03-lifetime-breakthrough-low-cost-efficient-oled.html

Interesting link. I am not sure that university lab results have trickled down to consumer displays within a year's time, but the future is looking bright.

Do you have a link for the brightness decay rate you gave?
 

mrtoaster

Neo Member
Hello wise GAF tv-wizards!

What do you think of the LG 65UH615V?
I'm thinking of this model because it has 4K, HDR and a 100 Hz panel. The price is good also.

Or would it be wise to wait for cheaper OLEDs?
 

holygeesus

Banned
Interesting link. I am not sure that university lab results have trickled down to consumer displays within a year's time, but the future is looking bright.

Do you have a link for the brightness decay rate you gave?

It was from LG directly, so how much credence it should get is debatable. Either way, you would think that with more companies like Loewe, Sony and Panasonic wanting to produce consumer sets, they must have confidence in its long-term viability.
 

Kyoufu

Member
Hello wise GAF tv-wizards!

What do you think of the LG 65UH615V?
I'm thinking of this model because it has 4K, HDR and a 100 Hz panel. The price is good also.

Or would it be wise to wait for cheaper OLEDs?

It would be wise to wait for cheaper OLEDs or at least cheaper LCDs with better performance than that TV. What's your budget?
 

III-V

Member
It was from LG directly, so how much credence it should get is debatable. Either way, you would think that with more companies like Loewe, Sony and Panasonic wanting to produce consumer sets, they must have confidence in its long-term viability.

I did not realize that Kyushu University was associated with LG.
 

The Beard

Member
Hello wise GAF tv-wizards!

What do you think of the LG 65UH615V?
I'm thinking of this model because it has 4K, HDR and a 100 Hz panel. The price is good also.

Or would it be wise to wait for cheaper OLEDs?

Stay away from LGs unless it's an OLED. Their LCD TVs are trash, especially their mid/lower tier sets.

You'd be much better off going with a full array Vizio (if available where you live), they have the best performing budget TVs right now. Otherwise, wait for a cheaper OLED.
 

mrtoaster

Neo Member
Can you guys explain this refresh rate thing? I want a 100 Hz TV, and most of the retail stores say this 65UH615V has 100 Hz, but some webpages say it's just 60 Hz. Confusing...

Maybe I should wait, but still... :(
 
I have a Samsung EPP discount, and some of the deals going on right now are insane. Now, obviously this is holiday pricing, but is waiting it out for the 2017 models still recommended? IOW, will the price meet or beat current EPP holiday pricing when the new models are announced? These deals seem bananas.
 

watership

Member
Why is the "warm2" option always the go to when calibrating the oleds? For some reason I cant stand watching it on that setting, it just looks like a really dim yellowish filter. Am I the only one who uses the "cool" setting? Looks much more vibrant to me.

I have the same feeling about Warm2, but it's mostly because the set needs to be calibrated, especially the white balance. I set mine to Cool, and it's fine for now. Sometime I'll figure out how to calibrate the set myself, but given that I don't have much time to muck around with it, I'll just pay for a calibration.
 
I have a Samsung EPP discount, and some of the deals going on right now are insane. Now, obviously this is holiday pricing, but is waiting it out for the 2017 models still recommended? IOW, will the price meet or beat current EPP holiday pricing when the new models are announced? These deals seem bananas.

The deals before BF and during BF were even better. The KS9000 65" and KS8000 65" were selling for $1475 and $1079 respectively. At this point I would wait; it wouldn't hurt. If anything, new models at CES will make you either double down and get a new model or seek a deal on a 2016 set.
 
The deals before BF and during BF were even better. The KS9000 65" and KS8000 65" were selling for $1475 and $1079 respectively. At this point I would wait; it wouldn't hurt. If anything, new models at CES will make you either double down and get a new model or seek a deal on a 2016 set.

This is testing my resolve for sure. The 55 KS8000 is currently at 729 haha.
 
The deals before BF and during BF were even better. The KS9000 65" and KS8000 65" were selling for $1475 and $1079 respectively. At this point I would wait; it wouldn't hurt. If anything, new models at CES will make you either double down and get a new model or seek a deal on a 2016 set.
Uhh... The KS8000 went back down to that price though.

Though it is now claiming to ship in 1-3 days, despite the fact that people who ordered last month still haven't gotten theirs, and mine's been in transit between California and Oregon for the past 6 days.
 

The Beard

Member
The agenda of some people to talk about burn-in or decay rate on OLEDs is astounding. I don't really get it.

It's plasma all over again. You have people who love the tech and either aren't perceptive at all to the real issues that it has, have pristine issue-free panels, or just straight up lie about how the issues of the past aren't issues anymore, "unless you run a static image in torch mode for 48 straight hours." (Horseshit)

Both sides go a bit overboard, but the truth is somewhere in the middle. OLED still has issues with IR. Some panels are worse than others. The compensation cycle is a nice addition, but it doesn't stop IR from appearing from scene to scene while viewing; it just clears any residual IR from your last viewing session.

A lot of people were burned by the Plasma Preachers who claimed IR wasn't an issue before buying their VT60s and ZT60s, which turned out to be insane IR magnets for a lot of people.
 

Theonik

Member
The agenda of some people to talk about burn-in or decay rate on OLEDs is astounding. I don't really get it.
It's a dilemma. Back in the plasma era we had the same issue. These are legitimate issues people are experiencing with these displays; to deny them on the premise of "it's not so bad on the best TVs" is disingenuous and might mislead people into buying very expensive sets only to be disappointed. On the other hand, this perception can linger years after the disadvantages are mitigated, and it can kill the future of the tech like it did with plasma.

It's plasma all over again. You have people who love the tech and either aren't perceptive at all to the real issues that it has, have pristine issue-free panels, or just straight up lie about how the issues of the past aren't issues anymore, "unless you run a static image in torch mode for 48 straight hours." (Horseshit)

Both sides go a bit overboard, but the truth is somewhere in the middle. OLED still has issues with IR. Some panels are worse than others. The compensation cycle is a nice addition, but it doesn't stop IR from appearing from scene to scene while viewing; it just clears any residual IR from your last viewing session.

A lot of people were burned by the Plasma Preachers who claimed IR wasn't an issue before buying their VT60s and ZT60s, which turned out to be insane IR magnets for a lot of people.
You see both of these in this thread.
 

III-V

Member
The agenda of some people to talk about burn-in or decay rate on OLEDs is astounding. I don't really get it.

Agenda?

This is a thread about television displays and technology.

University researchers are doing work all over the world to improve on the common problem areas associated with OLED displays.
 

Celcius

°Temp. member
There's no difference between a $3 Monoprice high speed HDMI cable and a $40 one from Best Buy, right? Being a digital signal, both cables will provide the exact same picture quality, correct?
 

Theonik

Member
There's no difference between a $3 Monoprice high speed HDMI cable and a $40 one from Best Buy, right? Being a digital signal, both cables will provide the exact same picture quality, correct?
Yes and no. Digital signals are harder to degrade, but you can still get irrecoverable data errors on an HDMI stream. These become more common with longer cables and higher frequencies (which are needed for higher data-rate stuff like 4K60).

Make sure your cable is rated for HDMI 2.0b.

e: Getting ahead of myself there hehe
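
For the curious, the 4K60 point is easy to put numbers on; the blanking overhead below is an approximation:

# Why 4K60 pushes cables: raw HDMI data-rate arithmetic for 8-bit RGB.
width, height, fps, bits_per_px = 3840, 2160, 60, 24

active = width * height * fps * bits_per_px  # pixel data alone: ~11.9 Gbit/s
total = active * 1.2                         # ~20% blanking overhead (approx.)
on_wire = total * 10 / 8                     # TMDS 8b/10b line encoding

print(f"~{on_wire / 1e9:.1f} Gbit/s on the wire")
# ~17.9 Gbit/s: right at HDMI 2.0's 18 Gbit/s ceiling, and far beyond the
# ~10.2 Gbit/s of HDMI 1.4, which is why marginal cables choke on 4K60.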
 

Lady Gaia

Member
Make sure your cable is rated for HDMI 2.1.

There is no HDMI 2.1 specification and when there is it almost certainly won't change cable requirements. Cables certified as "High Speed" HDMI are all that's required, but it is still possible to get a bad cable even if it passes certification so a trusted source is worthwhile.
 

III-V

Member
Yes and no. Digital signals are harder to degrade, but you can still get irrecoverable data errors on an HDMI stream. These become more common with longer cables and higher frequencies (which are needed for higher data-rate stuff like 4K60).

Make sure your cable is rated for HDMI 2.1.

I agree with what you said although 2.0b is the highest HDMI spec currently available.

2.1 is on the horizon though.
 
It's a dilemma. Back in the plasma era we had the same issue. These are legitimate issues people are experiencing with these displays; to deny them on the premise of "it's not so bad on the best TVs" is disingenuous and might mislead people into buying very expensive sets only to be disappointed. On the other hand, this perception can linger years after the disadvantages are mitigated, and it can kill the future of the tech like it did with plasma.

Price killed plasma. OLED has got to get price down too. You can't even buy an OLED at Costco because of price.

If plasma had stayed around, the IR would have been fully resolved, if the 2008 Kuros are any indication. I've still yet to experience IR on my calibrated Kuro, but I did on my 2007 Kuro and I've experienced it on a friend's Panasonics, so it's not like I'm blind to IR.

Price>>>Weight>>IR
 