
Television Displays and Technology Thread: This is a fantasy based on OLED

Probably. And 4000 nits. 2017 feels like it'll be a nit race (whether that is relevant or not, it feels like something marketing would sink their hooks into).

I honestly don't think any TV will get anywhere near 4000 nits in 2017, though. Sure, there'll be prototypes at the show, and I think some sets will edge past 2000, but that'll be it, plus some improvements in Rec. 2020 coverage.

But if the OLEDs from LG and Sony can hit 1000 nits, that will definitely count as a successful year, I'd say.
 

Anarion07

Member
I need some help with my B6.

I tried First Light in HDR today but it's just sooo dark.
I think I tried everything, from black levels on the PS4 to dynamic contrast etc., but no fix.
I noticed that HDR has a 2.2 gamma and it's greyed out so there's no way to change it. Anyone know how?

Or can someone with a B6 (and good-looking First Light HDR) share his HDR + PS4 settings?
Thanks!
 

Paragon

Member
Probably. And 4000 nits. 2017 feels like it'll be a nit race (whether that is relevant or not, it feels like something marketing would sink their hooks into).
I'll be pleasantly surprised if displays can hit 4000 nits. I'm expecting it to be more like 2000 nits this year, maybe 3000.
And it is important.

We don't see light linearly.
Meaningful brightness improvements with HDR are measured in "stops" rather than nits.
Each stop is a doubling of brightness.

  100 nits = SDR
  200 nits = 1 stop brighter
  400 nits = 2 stops brighter
  800 nits = 3 stops brighter
 1600 nits = 4 stops brighter
 3200 nits = 5 stops brighter
 6400 nits = 6 stops brighter
12800 nits = 7 stops brighter

So the nits values start to look ridiculous, but when you look at it in terms of "stops", you can see that going from 800 nits to 1000 is actually a small change: about a third of a stop.
You wouldn't see much difference at all between a 3200 nit display and a 3500 nit display because the next stop is at 6400 nits.
But you will see a big difference between a 100 nit display and a 400 nit display - still a 300 nit difference - because that is two stops brighter.
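The stop arithmetic above is just a base-2 logarithm; a quick Python sketch (the function name is mine):

```python
import math

def stops_brighter(nits: float, reference: float = 100.0) -> float:
    """Brightness difference in photographic stops vs. a reference level (100-nit SDR)."""
    return math.log2(nits / reference)

print(stops_brighter(800))                         # 800 nits = 3 stops over SDR
print(stops_brighter(1000) - stops_brighter(800))  # 800 -> 1000 nits is only ~0.32 of a stop
```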

I really hope HFR doesn't become a thing. I hate how it looks. Saw one movie too many for my lifetime (The Hobbit).
You saw a single movie and wrote off the whole thing for life.
Have you considered that The Hobbit was just a mediocre movie with bad effects, and HFR had little to do with it?
Nothing will force you to buy/watch the HFR versions of movies/TV shows, I'm sure there will still be 24p versions available just like you can still buy DVDs today.

I can't wait for HFR to get here.
My main interest is gaming, and 120Hz support is a big deal for that - though we really need variable refresh rate support to take full advantage of it.
It also means that I won't need to use interpolation to dejudder movies and TV shows any more. I love how smooth interpolation makes things, but don't like the glitches that often appear.

But if the OLEDs from LG and Sony can hit 1000 nits, that will definitely count as a successful year, I'd say.
I'll be more impressed if they can improve the full-screen brightness from 150 nits.
Peak brightness is far less important.
 

Eliciel

Member
Gaming ratings should be reconsidered now that the 2016 LG OLED models' upgrade has arrived. Game Mode will change a lot. Other than that I am comfortably at rank #1 with the C6. Nice job.
 

BumRush

Member
As someone that would like to buy an OLED after the 2017s are announced, I'm much more interested in brightness and near black uniformity than HFR...at least for this CES
 

holygeesus

Banned
I need some help with my B6.

I tried First Light in HDR today but it's just sooo dark.
I think I tried everything, from black levels on the PS4 to dynamic contrast etc., but no fix.
I noticed that HDR has a 2.2 gamma and it's greyed out so there's no way to change it. Anyone know how?

Or can someone with a B6 (and good-looking First Light HDR) share his HDR + PS4 settings?
Thanks!

Mine was like that till I turned to Limited black level, from automatic, but as you've tried that, I have no idea.
 

Kyoufu

Member
I need some help with my B6.

I tried First Light in HDR today but it's just sooo dark.
I think I tried everything, from black levels on the PS4 to dynamic contrast etc., but no fix.
I noticed that HDR has a 2.2 gamma and it's greyed out so there's no way to change it. Anyone know how?

Or can someone with a B6 (and good-looking First Light HDR) share his HDR + PS4 settings?
Thanks!

First Light and Second Son are both dark and not very good HDR implementations. Some parts look great (like particle effects) but others are just way too dark and unpleasant.

2.2 Gamma is what you'd want it at anyway.
 

III-V

Member
I need some help with my B6.

I tried First Light in HDR today but it's just sooo dark.
I think I tried everything, from black levels on the PS4 to dynamic contrast etc., but no fix.
I noticed that HDR has a 2.2 gamma and it's greyed out so there's no way to change it. Anyone know how?

Or can someone with a B6 (and good-looking First Light HDR) share his HDR + PS4 settings?
Thanks!

Mine was like that till I turned to Limited black level, from automatic, but as you've tried that, I have no idea.

When an HDR game is enabled on the PS4 Pro at 2160p, adjusting black level on the Pro does absolutely nothing. Ideally, it should become greyed out, but it doesn't. The Pro only outputs HDR in limited 2160p.

Change the black level on your TV to limited, as this should fix the issue.

HDR does not use a 2.2 gamma. That's the power curve used by SDR content. HDR currently uses SMPTE ST 2084 (PQ).

For reference, 100% on the standard power curve (100 nits) sits at roughly 50% on the ST 2084 curve.
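That rough equivalence can be checked against the ST 2084 (PQ) curve itself; a minimal sketch, with constants taken from the spec (the function name is mine):

```python
# SMPTE ST 2084 (PQ) inverse EOTF: absolute luminance in nits -> encoded signal (0..1)
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_signal(nits: float) -> float:
    y = min(max(nits, 0.0), 10000.0) / 10000.0  # PQ is defined up to 10,000 nits
    y_m1 = y ** M1
    return ((C1 + C2 * y_m1) / (1 + C3 * y_m1)) ** M2

print(round(pq_signal(100), 3))    # SDR reference white (100 nits) -> ~0.508, about 50% signal
print(round(pq_signal(10000), 3))  # 10,000 nits -> 1.0, the top of the signal range
```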
 

BumRush

Member
Dumb question: can someone with a B6 or C6 tell me where the bracket is located on the back (i.e. at what height)? Is it directly in the middle?
 

The Beard

Member
I'll be pleasantly surprised if displays can hit 4000 nits. I'm expecting it to be more like 2000 nits this year, maybe 3000.
And it is important.

We don't see light linearly.
Meaningful brightness improvements with HDR are measured in "stops" rather than nits.
Each stop is a doubling of brightness.

  100 nits = SDR
  200 nits = 1 stop brighter
  400 nits = 2 stops brighter
  800 nits = 3 stops brighter
 1600 nits = 4 stops brighter
 3200 nits = 5 stops brighter
 6400 nits = 6 stops brighter
12800 nits = 7 stops brighter

So the nits values start to look ridiculous, but when you look at it in terms of "stops", you can see that going from 800 nits to 1000 is actually a small change: about a third of a stop.
You wouldn't see much difference at all between a 3200 nit display and a 3500 nit display because the next stop is at 6400 nits.
But you will see a big difference between a 100 nit display and a 400 nit display - still a 300 nit difference - because that is two stops brighter.

You saw a single movie and wrote off the whole thing for life.
Have you considered that The Hobbit was just a mediocre movie with bad effects, and HFR had little to do with it?
Nothing will force you to buy/watch the HFR versions of movies/TV shows, I'm sure there will still be 24p versions available just like you can still buy DVDs today.

I can't wait for HFR to get here.
My main interest is gaming, and 120Hz support is a big deal for that - though we really need variable refresh rate support to take full advantage of it.
It also means that I won't need to use interpolation to dejudder movies and TV shows any more. I love how smooth interpolation makes things, but don't like the glitches that often appear.

I'll be more impressed if they can improve the full-screen brightness from 150 nits.
Peak brightness is far less important.

If the movie is shot at a higher frame rate, how would they still release a 24p version?

Peak brightness is absolutely more important than full-screen. Nobody needs a commercial popping up on the screen in the middle of the night with an all-white background at 1800 nits.

Showing bright highlights is more important than torching your retinas in a dark room.
 

ss_lemonade

Member
So I'm wondering if I just have some settings wrong or there's something else up. I currently have a ks9000 and was blown away after seeing those youtube hdr videos from that 4k channel. I decided to get Revenant yesterday to see how 4k hdr blurays looked. Unfortunately, I couldn't get my player (k8500) to display the movie properly. I got lots of white dots, flickering and random No Display messages. I figured it was the hdmi cable, so I tried a second cable that I had been using with my ps4 (non pro). This one was worse with no display at all. A 3rd cable that I had lying around though worked, which is interesting since this 3rd cable would not work at all with my ps4 at 1080p (any other resolution below that works fine). This is probably the first time I've run into issues with hdmi cables. All 3 have "high speed hdmi" written on them too.

Anyway, after testing 4k Revenant out and seeing how good it looked, I tried the 1080p version on my ps4. What exactly are the hdr differences I'm supposed to be seeing? Because to me, other than the resolution bump, they looked nearly identical after swapping back and forth between the 2. As far I can tell, the TV is outputting hdr just fine with the 4k version. Both sources have pretty much the same settings too (max backlight, high smart led, native rgb, no motion plus, no dynamic contrast, warm 2, 0 sharpness, uhd color)
 

Paragon

Member
If the movie is shot at a higher frame rate, how would they still release a 24p version?
Throw away 4/5 of the frames. Maybe add a lot of motion blur.
The Hobbit was shot at 48 FPS and released in 24p for the home market.

Peak brightness is absolutely more important than full-screen. Nobody needs a commercial to pop up on the screen in the middle of the night with an all white background @1800 nits.
HDR is supposed to support up to 400 nits full-screen brightness.
When OLEDs are only achieving 1/3 of that, I think higher full-screen brightness is a lot more important than 1/4 of a stop extra peak brightness.

Anyway, after testing 4k Revenant out and seeing how good it looked, I tried the 1080p version on my ps4. What exactly are the hdr differences I'm supposed to be seeing? Because to me, other than the resolution bump, they looked nearly identical after swapping back and forth between the 2. As far I can tell, the TV is outputting hdr just fine with the 4k version. Both sources have pretty much the same settings too (max backlight, high smart led, native rgb, no motion plus, no dynamic contrast, warm 2, 0 sharpness, uhd color)
Part of it depends on whether your display is properly calibrated.
A lot of TVs have their brightness (backlight/cell light) jacked up way too high for SDR, and are oversaturating the image in a wide gamut mode.
Now that shouldn't look the same as a native HDR image, but may make the difference smaller.

I've even seen some people say that they prefer the SDR image because it enables them to do this, while HDR has a fixed brightness that is not as suitable for brighter rooms.
 

The Beard

Member
So I'm wondering if I just have some settings wrong or there's something else up. I currently have a ks9000 and was blown away after seeing those youtube hdr videos from that 4k channel. I decided to get Revenant yesterday to see how 4k hdr blurays looked. Unfortunately, I couldn't get my player (k8500) to display the movie properly. I got lots of white dots, flickering and random No Display messages. I figured it was the hdmi cable, so I tried a second cable that I had been using with my ps4 (non pro). This one was worse with no display at all. A 3rd cable that I had lying around though worked, which is interesting since this 3rd cable would not work at all with my ps4 at 1080p (any other resolution below that works fine). This is probably the first time I've run into issues with hdmi cables. All 3 have "high speed hdmi" written on them too.

Anyway, after testing 4k Revenant out and seeing how good it looked, I tried the 1080p version on my ps4. What exactly are the hdr differences I'm supposed to be seeing? Because to me, other than the resolution bump, they looked nearly identical after swapping back and forth between the 2. As far I can tell, the TV is outputting hdr just fine with the 4k version. Both sources have pretty much the same settings too (max backlight, high smart led, native rgb, no motion plus, no dynamic contrast, warm 2, 0 sharpness, uhd color)

Since you were using 2 different players, did you pause both on the same frame and compare?


I'm thinking because you had your backlight blown the fuck out on both, you weren't getting an accurate representation of what either BluRay is actually supposed to look like, especially for the standard BluRay. Max brightness and high Smart LED on an edge-lit screen sounds like a nightmare tbh.
 

Weevilone

Member
If you have the backlight blown out, I'm guessing you are also oversaturating everything in SDR. Thus when you do HDR there isn't any headroom. I think it's important to have everything somewhat reasonably calibrated to get a good representation of either format.

On another note, I watched a couple newer Longmire episodes on my OLED last night with the built-in app. Wow, it was fucking gorgeous and I don't think it's even HDR.
 

Rodin

Member
I finally got a PS4 Pro. To use it with my LG OLED EG910V, should I put the console in full RGB with black level on high on the TV, or limited RGB with black level on low on the TV?

Also, I have the HDMI in PC mode, and there's no option to use 1:1 pixel mapping: only 16:9 and 4:3. Is 1:1 automatic in this case? The arrows in the overscan settings on the PS4 are right at the edges of the screen.
 
I need some help with my B6.

I tried First Light in HDR today but it's just sooo dark.
I think I tried everything, from black levels on the PS4 to dynamic contrast etc., but no fix.
I noticed that HDR has a 2.2 gamma and it's greyed out so there's no way to change it. Anyone know how?

Or can someone with a B6 (and good-looking First Light HDR) share his HDR + PS4 settings?
Thanks!
Both games have recently been patched to address this issue. There should be a contrast option in the game menu with low/high settings. When using HDR, set it to low. I would recommend kicking up the in-game brightness a notch or two as well if you're still not satisfied. This should net you a really excellent image if you've calibrated your set (make sure your HDMI black level is set to limited as well).

I have the B6 and it's spot on. I'd recommend setting the PS4 Pro (if that's what you're using) to 2160p YUV420 to avoid any banding issues as well. Keep in mind that the way the game tries to simulate light changes does lead to a bit of black crush.
 

Kyoufu

Member
Apparently there have been some burn in issues with OLED TVs. An issue that the poster was worried about earlier in the thread

Yeah, I'm aware of his concern.

But he's linking to a thread posted 7 months ago which has barely over 100 posts in it about someone seeing 2015 LG OLEDs with burn-in in a Best Buy store. Not even the 2016 models.

Not sure what to say, really.
 

Anarion07

Member
Apparently there have been some burn in issues with OLED TVs. An issue that the poster was worried about earlier in the thread

If you actually read those posts, the burn-in is only on BB store units running demos with the same logo all day long and shutting down without a compensation cycle.
Also, burn-in ≠ image retention.


So basically... don't buy floor models. SHOCKING! LG shot themselves in the foot.
 

Paragon

Member
People have reported image retention on the 2016 OLEDs when gaming in HDR.
It's not like it doesn't happen at all.
What they have said is that it's minor and hasn't been a major issue for them so far, but it does happen.

I wouldn't really consider damage to display models to be a concern though.
We know that burn-in can happen in scenarios where they're running a loop of the same content 16 hours a day and then killing the power at the wall, which prevents the sets from running compensation cycles, since those happen in standby mode.
 

sector4

Member
I'll be pleasantly surprised if displays can hit 4000 nits. I'm expecting it to be more like 2000 nits this year, maybe 3000.
And it is important.
I totally agree, the Z9D prototype at CES last year was 4000 nits, but who knows how long it will be until that makes it into the consumer version. Word on AVS was that the Z9D would be an 18 month product, so possibly not replaced until CES 2018, but who knows.

Looking like I'll be using Cleveland Plasma or someone similar. Just need to work everything out for cost. Hence the hope for a post-CES drop in price. Maybe late January or early February will be the time.
Nice, fingers crossed for you man! We've been seeing some pretty crazy price drops over here since it launched in September ($6999 > $6500 > $5600 > $4800) so hopefully the same thing happens where you are.
 
So I just set up my new 55" KS8000. I previously had the 65" model for a short time. No light bleed! Hurray! The light bleed was horrible on both the top and bottom of the 65". They're both AA02 panels, made in Mexico, FWIW. Anyway, I didn't return the 65" because of that, but because the 55" will better fit the secondary room I'll move it to once I get a 65" or larger 2017 OLED. But the lack of light bleed was a pleasant surprise and makes me much more satisfied using it as a temporary main display.
 
I totally agree, the Z9D prototype at CES last year was 4000 nits, but who knows how long it will be until that makes it into the consumer version. Word on AVS was that the Z9D would be an 18 month product, so possibly not replaced until CES 2018, but who knows.


Nice, fingers crossed for you man! We've been seeing some pretty crazy price drops over here since it launched in September ($6999 > $6500 > $5600 > $4800) so hopefully the same thing happens where you are.

The US price has been holding strong so far, but I'm sure something will be happening sooner or later. Like I said, I won't be making a purchase until over a month from now. 37 days. But I'll coordinate everything with the house closing and the rest. Exciting times really.

75 for life.

I'll have my go fund me link up any day now
 

sector4

Member
The US price has been holding strong so far, but I'm sure something will be happening sooner or later. Like I said, I won't be making a purchase until over a month from now. 37 days. But I'll coordinate everything with the house closing and the rest. Exciting times really.

75 for life.

I'll have my go fund me link up any day now
Haha I'll kick in a few dollars to see someone else get up in here with the Z ;)

Surely it has to come down soon, it sucks you guys don't have the Panasonic set to drive the price down quicker, but it'd have to come down in line with other markets soon enough.

Wow 75" haha that will be something else! :D Can't wait to hear your impressions. Am I remembering correctly that 3D is important to you? If so it might be something else worth comparing in store, not to make your decision harder or anything, but I've heard that 3D is one of OLEDs other strong suits. Not to say it's bad on the Z, I've barely tried it, but it's active vs passive, and I'm not sure if you have a preference.
 
Haha I'll kick in a few dollars to see someone else get up in here with the Z ;)

Surely it has to come down soon, it sucks you guys don't have the Panasonic set to drive the price down quicker, but it'd have to come down in line with other markets soon enough.

Wow 75" haha that will be something else! :D Can't wait to hear your impressions. Am I remembering correctly that 3D is important to you? If so it might be something else worth comparing in store, not to make your decision harder or anything, but I've heard that 3D is one of OLEDs other strong suits. Not to say it's bad on the Z, I've barely tried it, but it's active vs passive, and I'm not sure if you have a preference.

I had active 3D some years ago on a Samsung. I liked it, but you're right. I do need to go test that out. Can't spend all that money and not even get the 3D quality that I'm looking for.
 

III-V

Member
Wow 75" haha that will be something else! :D Can't wait to hear your impressions. Am I remembering correctly that 3D is important to you? If so it might be something else worth comparing in store, not to make your decision harder or anything, but I've heard that 3D is one of OLEDs other strong suits. Not to say it's bad on the Z, I've barely tried it, but it's active vs passive, and I'm not sure if you have a preference.

I'm struggling to switch from my 100" PJ, even with 4k.

The day draws nigh.
 
I'm struggling to switch from my 100" PJ, even with 4k.

The day draws nigh.

I had a 70" in my condo, but included it in the sale. I don't want to go much smaller than that. The 65" screens look nice, but my heart is really set on a 75" for the living room/den and a 65" in the loft. We shall see.
 

III-V

Member
I had a 70" in my condo, but included it in the sale. I don't want to go much smaller than that. The 65" screens look nice, but my heart is really set on a 75" for the living room/den and a 65" in the loft. We shall see.

I hear you. A 100" is 78% larger than a 75" by area. Sadness.
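At a fixed aspect ratio, screen area scales with the square of the diagonal, which is where that 78% comes from; a quick check (the function name is mine):

```python
def area_increase(diag_large: float, diag_small: float) -> float:
    """Fractional area gain from the smaller diagonal to the larger (same aspect ratio)."""
    return (diag_large / diag_small) ** 2 - 1

print(round(area_increase(100, 75) * 100))  # -> 78 (percent more screen area)
```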
 

Geneijin

Member
As someone that would like to buy an OLED after the 2017s are announced, I'm much more interested in brightness and near black uniformity than HFR...at least for this CES
Black uniformity shouldn't be an issue. Gray uniformity, however... it's not as good as I'd like. Debating exchanging it.
 

NYR

Member
You saw a single movie and wrote off the whole thing for life.
Yes. It's called taste. As in, you try something, didn't like it, and don't want it again. I tried cilantro once - didn't like it. No need to try it again. I'm not a bloody idiot, when I don't like something, you don't try it multiple times just to be sure because someone on the Internet thinks you're wrong.
 

BumRush

Member
No, it's not. I'll post a photo of it later.

edit: 55B6


Thanks. Hmmm, not sure how this will impact me mounting it (above the fireplace)

Yes. It's called taste. As in, you try something, didn't like it, and don't want it again. I tried cilantro once - didn't like it. No need to try it again. I'm not a bloody idiot, when I don't like something, you don't try it multiple times just to be sure because someone on the Internet thinks you're wrong.

To be fair, I think his point was that The Hobbit could have just been a bad implementation of the tech.
 
Stranger Things - these pics are just used as an extreme example to show pure white next to pure black and how the Z9D can absolutely go toe-to-toe with OLED for perfect blacks. I pulled the blinds down for this, but I kept the light on the bottom of the TV and receiver on so you can see the blacks haven't been crushed in post-processing.


Had time to watch the show today and I took a picture for comparison. Never mind the potato camera quality, but it looks like you're actually getting crushed blacks.

Great show btw.
 

Paragon

Member
Had time to watch the show today and I took a picture for comparison. Never mind the potato camera quality, but it looks like you're actually getting crushed blacks.
Just looks like camera exposure/display gamma differences to me.
You can't really tell much about how two displays compare from separate photos taken with different cameras and different exposure values.

At best, you can do a controlled comparison if you're using the same camera in manual mode and you have both displays in front of you at the same time.
Even then you have to be really careful and know what you're doing, and the differences can only be relative.
A photo is not a good way to judge a display.

I would just assume that any issues seen in a photograph of a display are caused by the camera/photo, unless the poster is specifically trying to demonstrate an issue that they are seeing.
 

The Beard

Member
Yes. It's called taste. As in, you try something, didn't like it, and don't want it again. I tried cilantro once - didn't like it. No need to try it again. I'm not a bloody idiot, when I don't like something, you don't try it multiple times just to be sure because someone on the Internet thinks you're wrong.

Terrible line of reasoning. When you try foods for the first time which are prepared by shitty cooks, you write them off completely for the rest of your life?

Had time to watch the show today and I took a picture for comparison. Never mind the potato camera quality, but it looks like you're actually getting crushed blacks.

Great show btw.

Nah, his pic looks fine to me (could maybe use a touch more brightness, but that could be the result of the camera exposure). Your TV is way too bright, which is clipping your whites.
 
This is not an issue on 2016 models unless you buy a used demo floor unit that has had the same demo on loop at max brightness. Also, don't use these as monitors for extended periods unless gaming/watching a movie; that should be a given.

So nothing has changed from plasmas then. Gotcha.

The point is that burn-in is still physically possible, because of how the pixels may wear unevenly on emissive displays. It's most evident on floor model TVs which show static logos and run the same demo loop over and over for a year but the fact that it can occur means that it still exists.

As long as this is the case, no one can claim it can't happen. Because it does and you can go to Best Buy and look at the evidence with your own eyes.
 

Kyoufu

Member
So nothing has changed from plasmas then. Gotcha.

The point is that burn-in is still physically possible, because of how the pixels may wear unevenly on emissive displays. It's most evident on floor model TVs which show static logos and run the same demo loop over and over for a year but the fact that it can occur means that it still exists.

As long as this is the case, no one can claim it can't happen. Because it does and you can go to Best Buy and look at the evidence with your own eyes.

You missed one key fact: those TVs weren't running compensation cycles after being turned off, therefore the panel couldn't clear any IR present at the time. This never happens in real world day-to-day use because you're running compensation cycles once you're done for the day/night.
 
You missed one key fact: those TVs weren't running compensation cycles after being turned off, therefore the panel couldn't clear any IR present at the time. This never happens in real world day-to-day use because you're running compensation cycles once you're done for the day/night.

So go to Best Buy, look at one of the floor models, and manually run the "compensation cycle" and see what happens. I'm willing to bet that nothing will happen, because the "compensation cycle" doesn't magically evenly wear pixels which have spent a year showing the same thing and have therefore worn very unevenly.
 

BumRush

Member
So nothing has changed from plasmas then. Gotcha.

The point is that burn-in is still physically possible, because of how the pixels may wear unevenly on emissive displays. It's most evident on floor model TVs which show static logos and run the same demo loop over and over for a year but the fact that it can occur means that it still exists.

As long as this is the case, no one can claim it can't happen. Because it does and you can go to Best Buy and look at the evidence with your own eyes.


"Can't happen" is not equal to "not an issue if you're reasonable" (still image for 16 straight hours, not running compensation cycles, etc.)
 
Nah, his pic looks fine to me (could maybe use a touch more brightness, but that could be the result of the camera exposure). Your TV is way too bright, which is clipping your whites.

Brightness is fine on my TV; the shitty LG phone camera just makes it look like it's clipping.
My point is that his picture is missing some details on the darker side (around the ear and shoulder), which makes me think he's getting crushed blacks.
 
"Can't happen" is not equal to "not an issue if you're reasonable" (still image for 16 straight hours, not running compensation cycles, etc.)

So now we're back to having to baby your TV and not able to do certain things with it like with plasma. Yeah, I'm not going to do that. If I'm using my media PC looking at the Windows desktop and I'm browsing the Internet I'm going to be looking at the Windows taskbar on my TV for potentially hundreds and hundreds of hours over the course of a year. Is this considered "reasonable" usage? Who defines what "reasonable" usage is?
 

The Beard

Member
Brightness is fine on my TV; the shitty LG phone camera just makes it look like it's clipping.
My point is that his picture is missing some details on the darker side (around the ear and shoulder), which makes me think he's getting crushed blacks.

I don't think so. It looks like everything that was meant to be seen is seen on his. Her left ear is still totally visible but dark, as it was (likely) meant to be. Yours is blowing out the areas around the shadows, but not really showing any added detail.

I think it's mostly due to the different cameras and distances, though. His was clearly taken from further away than yours. If you took the pic from further back, it probably wouldn't look as blown out. Also, if he took his a little closer, I'm sure it'd bring out more brightness.
 

BumRush

Member
So now we're back to having to baby your TV and not able to do certain things with it like with plasma. Yeah, I'm not going to do that. If I'm using my media PC looking at the Windows desktop and I'm browsing the Internet I'm going to be looking at the Windows taskbar on my TV for potentially hundreds and hundreds of hours over the course of a year. Is this considered "reasonable" usage? Who defines what "reasonable" usage is?

Come on man, you know damn well that leaving the same static image for 16+ hours a day at 100% brightness and NEVER turning your TV off by using the power button (which runs a compensation cycle) isn't normal TV behavior for 99.99999999% of people. If it is for you, great, don't get an OLED. But please don't act like not doing the above is having to "baby" your TV.
 

Weevilone

Member
So nothing has changed from plasmas then. Gotcha.

I don't even understand what your objective is. You could technically get burn-in on old-school CRTs. I've seen burn-in on LCDs, including a MacBook Pro that I had.

Given enough abuse you can permanently damage about any display.
 