
Television Displays and Technology Thread: This is a fantasy based on OLED

Thanks for the reply. I'll post my current HDR settings tonight when I get back from work. I remember reading somewhere that it helps to put the PS4 setting on YUV 4:2:2 (or 4:2:0) instead of Automatic. Is there any truth to that?

That depends on whether you're on base or pro, last time I checked
 

ToD_

Member
So I got my 65" C7 yesterday and I'm pretty happy with it. After some tweaking I managed to settle on settings for SDR gaming and regular TV watching. I noticed some colours are brighter than others. Like in Forza Horizon 3 on the OG Xbox One, the orange in the UI and score elements is too bright. I can't really put my finger on which setting is best to tweak for this: Colour, Tint or Colour Temperature? Tried a bunch of stuff but it either alters other colours too much or makes the bright ones worse.

Have you checked color gamut? It's best to set it to auto. The wide setting in particular can make certain colors look oversaturated.

A respected calibrator at AVSforum (Chad B) suggests the following:
Color Gamut: Auto
Color: 53 - 56 range
Tint: R2 - R3

I calibrated mine with color at 53 and tint at R3 with some tweaks in the Color Management System. Don't touch the CMS unless you have a meter, though.
 

MauroNL

Member
Have you checked color gamut? It's best to set it to auto. The wide setting in particular can make certain colors look oversaturated.

A respected calibrator at AVSforum (Chad B) suggests the following:
Color Gamut: Auto
Color: 53 - 56 range
Tint: R2 - R3

I calibrated mine with color at 53 and tint at R3 with some tweaks in the Color Management System. Don't touch the CMS unless you have a meter, though.
Thanks, I'll check that out tonight. This applies to SDR though, correct? Also, on Xbox One what is the best colour depth setting: 8, 10 or 12-bit? I guess 10 is best 'cause of the 10-bit panel.
 

ToD_

Member
Thanks, I'll check that out tonight. This applies to SDR though, correct? Also, on Xbox One what is the best colour depth setting: 8, 10 or 12-bit? I guess 10 is best 'cause of the 10-bit panel.

Those settings only apply to SDR.

I am not sure about your Xbox One setting (I don't have one), but I'd just set it to 12 and see if it works. The TV is still capable of processing 12-bit signals; in fact, Dolby Vision is 12-bit. Higher bit depth for SDR content should just reduce colour banding. The HDMI spec does limit 4K RGB 4:4:4 at 60Hz to 8-bit on both the Xbox and the TV, but for movies at 24Hz the TV supports 12-bit. In addition, 1080p RGB 4:4:4 at 60Hz will work at 12-bit as well, so that covers the vast majority of Xbox One games, I believe.
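
If you want to sanity-check the bandwidth side of that, here's some rough Python math. Treat it as a sketch: the blanking totals are assumed standard CTA-861 timings, and the ~14.4 Gbit/s figure is HDMI 2.0's usable rate after 8b/10b encoding.

# Rough HDMI 2.0 bandwidth check. Assumptions: standard CTA-861 blanking
# totals and ~14.4 Gbit/s usable data rate (18 Gbit/s TMDS, 8b/10b encoding).

def data_rate_gbps(h_total, v_total, refresh_hz, bits_per_channel):
    """Raw RGB/4:4:4 data rate including blanking intervals."""
    pixel_clock = h_total * v_total * refresh_hz  # pixels per second
    return pixel_clock * bits_per_channel * 3 / 1e9

HDMI_2_0_LIMIT = 14.4  # Gbit/s

modes = [
    ("4K60 RGB 8-bit",     4400, 2250, 60,  8),
    ("4K60 RGB 10-bit",    4400, 2250, 60, 10),
    ("4K24 RGB 12-bit",    5500, 2250, 24, 12),
    ("1080p60 RGB 12-bit", 2200, 1125, 60, 12),
]

for name, h, v, hz, bpc in modes:
    rate = data_rate_gbps(h, v, hz, bpc)
    verdict = "fits" if rate <= HDMI_2_0_LIMIT else "doesn't fit"
    print(f"{name}: {rate:.2f} Gbit/s -> {verdict}")

That prints roughly 14.26 (fits), 17.82 (doesn't fit), 10.69 (fits) and 5.35 (fits), which is why 4K60 RGB tops out at 8-bit while 24Hz movies and 1080p60 games get 12-bit.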
 
Best aspect ratio is Just Scan enabled and Original, right?

I ask this because just now I noticed the right black border of the screen being slightly bigger than the left border. I then changed aspect ratio from original to 16:9 and the borders were fine again. Changed it back to original and it was the same. Odd.
 
Best aspect ratio is Just Scan enabled and Original, right?

I ask this because just now I noticed the right black border of the screen being slightly bigger than the left border. I then changed aspect ratio from original to 16:9 and the borders were fine again. Changed it back to original and it was the same. Odd.

That's the screen shift feature that supposedly helps prevent burn in. You can disable it in the settings (I very much doubt it actually helps considering that most static image parts are wider than a few pixels anyway).
 

Trago

Member
Thought I'd turn to this thread to see if anyone knows whether HDMI 2.1 will mean that televisions will have support for G-Sync specifically?

I'm planning on having a living room gaming PC setup in the future. Been reading article after article about HDMI 2.1 and I saw some mentions of Freesync compatibility with PC and the Xbox One X, but nothing on G-Sync. Or is Game Mode VRR its own thing, compatible with any GPU??

I'm confused.
 

Ashhong

Member
I'm not sure what cheap to you is, but the Xbox One S is regularly $200 new with a game. There are the Samsung and Philips players too, which are about the same price. Just depends what you want really.

Hopefully DV players go down in price, or at least come in more options than like 3. Feels a little silly to not buy a DV player when you have a DV TV.

But yeah if you don't watch Blu-Rays regularly, all that investment might not be worth it.
Netflix/Amazon is a good start, but UHD discs are really something else. With the coming price hike to Netflix, I might actually cancel as I feel it's no longer cheap enough to just sub to without watching regularly.

200 isn't bad, especially since it's also an Xbox, but you just reminded me of DV. I'll hold off until at least CES and hopefully there will be some cheap DV players.

Has anybody played Hellblade on an OLED? I turned it on last night and granted I was high, but apparently I set the contrast way too low, and everything was super dark. I thought it was part of the game though because it kept saying how the darkness was coming, and I played for a good 15 minutes walking around in shadows not able to see anything. Really showed off the black levels of the TV tho :lol
 

Kyoufu

Member
I've heard so much bad feedback from Xbox One S users regarding UHD Blu Ray playback (compatibility issues) that I feel like you're probably better off with a dedicated player if you're at all serious about UHD Blu Rays.
 
Got my C7 today. All I can say is: HNNNNNNNNNNNNG

Nice! :)

Yeah Lost Legacy looks so good on my A1E.

Wow, your restraint is much better than mine. First thing I did was buy a UHD player w/ 3 movies. I'm up to 16 (!) now. Then the Netflix 4K upgrade. Then PS4 Pro HDR games. I fell in. I actually have way more 4K content than I have time to enjoy tbh, which isn't a bad thing.

Haha, same here, I jumped on the Samsung player the day it came out to have a UHD player asap.
 
That's the screen shift feature that supposedly helps prevent burn in. You can disable it in the settings (I very much doubt it actually helps considering that most static image parts are wider than a few pixels anyway).

Oh so the screen shift option actually does that? It slightly makes the borders bigger?
 

Mrbob

Member
Considering buying a new TV but might hold out for 2018 to see if we get adaptive sync.
I think it'll happen for some high end TVs but we have to see if Nvidia or AMD offer compatible cards as well. I think AMD will, but I'm doubtful Nvidia supports HDMI 2.1 when they have G-Sync.
 

Kudo

Member
Does anyone use their TV as a PC monitor as well?

I don't use it as a monitor but I have my OLED connected to a PC too, for games and media.
I had a 32" TV as a monitor in the past and it was fine, but I feel like having a 55" on my desk would be a bit overkill.
 

Trago

Member
I think it'll happen for some high end TVs but we have to see if Nvidia or AMD offer compatible cards as well. I think AMD will, but I'm doubtful Nvidia supports HDMI 2.1 when they have G-Sync.

That's what I'm wondering though. If it's only Freesync, then I'll have to go AMD on a new build.
 
Absolutely loving the Samsung MU9000 we bought. The built-in Steam Link works incredibly well, and that's with both the TV and PC running on wireless.

Can't say I have any complaints aside from how heavy the TV was. I didn't expect it to be 30kg. How do you remove the grease/smudge marks? We have wipes we use on our phones but it looks like they're just making it worse.

I got a nice surprise when I was setting up my Samsung login on the TV to install apps. My Samsung phone just vibrated and asked if I wanted to share the login to my TV so that I don't have to type it. Small thing but really welcome.

Oh yeah is there any reason why I shouldn't set HDR on all HDMI ports?
 

Adobe

Member
If you aren't HDR gaming in a bright ass room, the B7/C7 is still the most impressive screen you can get, firmware bugs or no

And what about the panel uniformity of these OLEDs (e.g. banding, vignetting, burn-in etc.)?
Do people really complain about small things or is it a real problem?
 

torontoml

Member
And what about the panel uniformity of these OLEDs (e.g. banding, vignetting, burn-in etc.)?
Do people really complain about small things or is it a real problem?
I've got one band a little to the right of center that I notice with some regularity, though it really is content dependent. Didn't see it at all during GOTG 2. During John Wick 2 it was a little more prevalent in the tunnels, where it would be there for a second and then gone again, but I didn't notice it anywhere else during the movie. Haven't noticed any other issues.
 

LilJoka

Member
And what about the panel uniformity of these OLEDs (e.g. banding, vignetting, burn-in etc.)?
Do people really complain about small things or is it a real problem?

Mine is band free, at least I can't see any at all. But around 40% of owners see tint issues towards one edge or area on the 65". My C7 65 is slightly yellow tinted towards the left side for a few inches, and my B6 65 was the same. The 55" screens are pretty flawless.
 

Mrbob

Member
That's what I'm wondering though. If it's only Freesync, then I'll have to go AMD on a new build.
Some will argue otherwise but I'm unconvinced Nvidia will give up G-Sync to support HDMI 2.1.

They don't have to support HDMI 2.1 adaptive sync.
 
Oh so the screen shift option actually does that? It slightly makes the borders bigger?

Well, it periodically shifts the whole screen left and right, so it will cut off a few columns of pixels from one side of the image and add a few black columns to the other side.
Since you mentioned that one border was bigger, I assume that's the cause. Playing with the aspect ratio options would reset screen shift to its default position.
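
A toy numpy sketch of the idea (purely illustrative, not LG's actual firmware):

import numpy as np

# Toy model of screen shift: moving the frame right by `offset` columns
# crops that many columns off the right edge and leaves black columns on
# the left, which is why one border ends up wider than the other.
def screen_shift(frame, offset):
    shifted = np.zeros_like(frame)  # start from an all-black frame
    if offset >= 0:
        shifted[:, offset:] = frame[:, :frame.shape[1] - offset]
    else:
        shifted[:, :offset] = frame[:, -offset:]
    return shifted

frame = np.full((2160, 3840, 3), 255, dtype=np.uint8)  # solid white 4K frame
print(screen_shift(frame, 3)[:, :3].max())  # 0 -> three black columns on the left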
 
Well, it periodically shifts the whole screen left and right, so it will cut off a few columns of pixels from one side of the image and add a few black columns to the other side.
Since you mentioned that one border was bigger, I assume that's the cause. Playing with the aspect ratio options would reset screen shift to its default position.

Yeah, good point there. I think I'm gonna leave it on though because there don't seem to be any other options that do something similar and when it comes to IR and burn in I'm not taking any chances.
 

MauroNL

Member
Those settings only apply to SDR.

I am not sure about your Xbox One setting (I don't have one), but I'd just set it to 12 and see if it works. The TV is still capable of processing 12-bit signals; in fact, Dolby Vision is 12-bit. Higher bit depth for SDR content should just reduce colour banding. The HDMI spec does limit 4K RGB 4:4:4 at 60Hz to 8-bit on both the Xbox and the TV, but for movies at 24Hz the TV supports 12-bit. In addition, 1080p RGB 4:4:4 at 60Hz will work at 12-bit as well, so that covers the vast majority of Xbox One games, I believe.
Okay, so I played some more with the HDR settings and it's a lot better now. I'm using HDR Standard because HDR Game is just way too dim and I can't really get it anywhere near playable. Left most things intact and turned off all enhancements. Colour 50, Tint 0 and Temperature W45. I'm still a bit confused about HDMI black level in HDR. The auto setting on PS4 doesn't really work great IMO. For SDR I use Low, but in HDR it's too dark/crushed, while using Limited makes it too pale/grey. What is best in the different scenarios? SDR is PS4 Limited with TV Low, but should HDR be the same, or both High/Full? If I find either too dark or grey, is it best to tweak this with Brightness?

Also, how does everyone use OLED Light below 40 in SDR/TV?! It's so dim IMO, or am I doing something wrong? I have mine at 75-80 now, but everywhere I hear of nobody really going above 40 outside HDR.
 

Mrbob

Member
Can we expect to see high refresh rate input and variable refresh rate TVs in 2018?
The 2017 LG OLEDs can do 1080p 120Hz right now, so I'd expect VRR support for 120Hz next year.

Here is the thing though... AMD is prepping Freesync 2 right now, so it's possible VRR in 2020 is better than VRR in 2018.
 
I think it'll happen for some high end TVs but we have to see if Nvidia or AMD offer compatible cards as well. I think AMD will, but I'm doubtful Nvidia supports HDMI 2.1 when they have G-Sync.

This is the main reason I can't see G-Sync/VRR being a real thing anytime soon, as in the next 5 years. The tech has to come out, and then Nvidia/AMD have to go out of their way to implement it. It'll probably come, but not exactly soon.

I'm ok with my 2017 C7. Loving it. (Wish the HDR gaming brightness was corrected.)
 

dsk1210

Member
Some will argue otherwise but I'm unconvinced Nvidia will give up G-Sync to support HDMI 2.1.

They don't have to support HDMI 2.1 adaptive sync.

They will lose a few customers if they do not support adaptive sync on HDMI 2.1. I have always gone Nvidia, but AMD-only support would easily push me over to them.
 

III-V

Member
But I want G-Sync support specifically, and I suspect Nvidia are gonna be jerks about it and not support new television sets.

I don't think the TV manufacturers will go out of their way for Nvidia licensing when VRR is part of the HDMI standard, and it's a nice addition to their game mode features. But that's just me.
 

Trago

Member
I don't think the TV manufacturers will go out of their way for Nvidia licensing when VRR is part of the HDMI standard, and it's a nice addition to their game mode features. But that's just me.

Just so I'm clear, Game Mode VRR is a separate new standard?

So there won't be support for Freesync/g sync? It's its own thing?
 

III-V

Member
Just so I'm clear, Game Mode VRR is a separate new standard?

So there won't be support for Freesync/g sync? It's its own thing?

It will be its own thing:


"Game Mode VRR features variable refresh rate, which enables a 3D graphics processor to display the image at the moment it is rendered for more fluid and better detailed gameplay, and for reducing or eliminating lag, stutter, and frame tearing."

more here:

https://www.hdmi.org/manufacturer/hdmi_2_1/index.aspx

Also some comments from Richard at DF

http://www.eurogamer.net/articles/digitalfoundry-2017-hdmi-2-1-specifications-revealed
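
If it helps, here's a toy Python model of why presenting frames "at the moment rendered" reduces stutter. The 40-120Hz window is just an assumed example, not anything from the HDMI 2.1 spec:

import math

# Toy model: at a fixed 60 Hz, a frame that just misses a vsync waits for
# the next one; with VRR the display starts scanout as soon as the frame
# is ready, as long as it lands inside the panel's refresh window.

FIXED_REFRESH = 1 / 60               # 16.67 ms per refresh
VRR_MIN, VRR_MAX = 1 / 120, 1 / 40   # assumed 40-120 Hz VRR window

def fixed_display_time(render_done):
    """Frame appears at the next vsync boundary after rendering finishes."""
    return math.ceil(render_done / FIXED_REFRESH) * FIXED_REFRESH

def vrr_display_time(render_done, prev_shown):
    """Frame appears when ready, clamped to the panel's refresh window."""
    return min(max(render_done, prev_shown + VRR_MIN), prev_shown + VRR_MAX)

render_done = 0.018  # frame took 18 ms, just missing the 60 Hz vsync
print(f"fixed 60 Hz: shown at {fixed_display_time(render_done) * 1000:.1f} ms")   # 33.3 ms
print(f"VRR:         shown at {vrr_display_time(render_done, 0.0) * 1000:.1f} ms")  # 18.0 ms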
 

Trago

Member
It will be its own thing:


"Game Mode VRR features variable refresh rate, which enables a 3D graphics processor to display the image at the moment it is rendered for more fluid and better detailed gameplay, and for reducing or eliminating lag, stutter, and frame tearing."

more here:

https://www.hdmi.org/manufacturer/hdmi_2_1/index.aspx

Also some comments from Richard at DF

http://www.eurogamer.net/articles/digitalfoundry-2017-hdmi-2-1-specifications-revealed


Nice, thanks.
 

supersaw

Member
If VRR is its own thing it might prompt Nvidia to swallow its pride and include it over HDMI, and leave G-Sync as an option for people that want it for more exotic features like multi-monitor setups over DisplayPort.
 

Mrbob

Member
I think we'll end up seeing G-Sync enabled TVs eventually with DisplayPort connections. Curious how many times Nvidia have decided to go with an open standard in place of doing something proprietary themselves.
 

Kyoufu

Member
I think we'll end up seeing G-Sync enabled TVs eventually with DisplayPort connections. Curious how many times Nvidia have decided to go with an open standard in place of doing something proprietary themselves.

Why DisplayPort? HDMI 2.1 has all the bandwidth needed.
 

Mrbob

Member
Maybe it can, but G-Sync is a chip inside the screen that syncs the screen with the GPU. The connection doesn't matter as much as whether TV manufacturers will want to add the cost of that chip. I don't see Nvidia giving up the money from selling this chip for every TV in favour of strictly using the HDMI 2.1 standard VRR instead. Perhaps they will create a proprietary HDMI 2.1 VRR setup without their chip but sell it with a license fee. Nvidia will get their money one way or another, especially now that their stock price has risen so much. Need to keep it going higher.


Once consoles and AMD start supporting VRR over HDMI, Nvidia will have no choice but to suck it up and fall in line.
Unfortunately, we need AMD to step up its GPU sales for Nvidia to take notice. Intel didn't care about AMD until Ryzen started taking 50% CPU share.

I'll be pleasantly surprised if Nvidia supports the HDMI 2.1 VRR standard, but the way the company does business makes me think they only will if they have to.
 

Hawk269

Member
I've heard so much bad feedback from Xbox One S users regarding UHD Blu Ray playback (compatibility issues) that I feel like you're probably better off with a dedicated player if you're at all serious about UHD Blu Rays.

They are constantly doing updates to the app that runs the player. I have not had any issues in the last few months; it has been pretty much worry-free. At the beginning there were some slight audio dropouts and on occasion a weird pause, but most of those issues have been resolved. A friend of mine says every once in a while he gets an odd audio dropout, but he says it has been happening less since the last update.
 
Obviously this is all speculation...


But why would an LG/Sony/Samsung go out of their way to court Nvidia and work out a deal to put the G-Sync chip in their TVs? It would just be a huge headache and bring the cost up. Don't G-Sync monitors still go for a premium compared to their non-G-Sync counterparts?

While most people in this forum are vocal about having gaming PCs hooked up to their TVs, are we really a big enough market?

The practicality of it all, and how long it'll realistically take for all the stars to align, makes it seem like there's a lot that still needs to be worked out, even though the tech might be close to ready. I'm guessing it's 2-4 years out.
 
Okay, so I played some more with the HDR settings and it's a lot better now. I'm using HDR Standard because HDR Game is just way too dim and I can't really get it anywhere near playable. Left most things intact and turned off all enhancements. Colour 50, Tint 0 and Temperature W45. I'm still a bit confused about HDMI black level in HDR. The auto setting on PS4 doesn't really work great IMO. For SDR I use Low, but in HDR it's too dark/crushed, while using Limited makes it too pale/grey. What is best in the different scenarios? SDR is PS4 Limited with TV Low, but should HDR be the same, or both High/Full? If I find either too dark or grey, is it best to tweak this with Brightness?

Also, how does everyone use OLED Light below 40 in SDR/TV?! It's so dim IMO, or am I doing something wrong? I have mine at 75-80 now, but everywhere I hear of nobody really going above 40 outside HDR.

Do you have a standard PS4 or PS4 Pro? For a standard PS4 and HDR you should pair Full RGB range with High black level to create a correct picture. For PS4 Pro you should pair Limited with Low. Don't use the PS4's automatic setting for RGB range; it is unreliable.

Right now a lot of developers seem to make dark portions of the picture unnecessarily dark in HDR (I guess in a failed attempt to make full use of HDR's dynamic range). Infamous SS is a prime example of this. So even if you have everything set up correctly it can still seem dark in the darker portions of the image.
 
So I got my second XE93 two days ago. This one also has a stuck pixel, but this time only one and it's so small I can't see it, so I'll keep the TV.

For me it's perfect. Played with a friend yesterday on a sunny day in my top-floor flat and the TV battled the sunlight with success. Uniformity of the set is good and I don't have banding, clouding or DSE. Local dimming works very well, using it set to middle. Upscaling on the TV is wonderful. Watched the last episode of Narcos with subtitles and I didn't notice any blooming or haloing; maybe with an HDR source I will.

Overall it's a great experience and I'm very happy with my choice.
 
Thought it would be cool to hook up the first Xbox and PS2 to my B6, but of course it didn't have the RGB SCART connector on the back anymore. What would I need to be able to hook these two up?
 

MauroNL

Member
Thought it would be cool to hook up the first Xbox and PS2 to my B6, but of course it didn't have the RGB SCART connector on the back anymore. What would I need to be able to hook these two up?
Wondering this as well for my C7. The quick guide shows some yellow port on the back for both AV and Component, but it's not on my EU model. Maybe it's US only?

Do you have a standard PS4 or PS4 Pro? For a standard PS4 and HDR you should pair Full RGB range with High black level to create a correct picture. For PS4 Pro you should pair Limited with Low. Don't use the PS4's automatic setting for RGB range; it is unreliable.

Right now a lot of developers seem to make dark portions of the picture unnecessarily dark in HDR (I guess in a failed attempt to make full use of HDR's dynamic range). Infamous SS is a prime example of this. So even if you have everything set up correctly it can still seem dark in the darker portions of the image.
Thanks, I'll try that out tonight to see if it looks right. I have the standard PS4 btw. This only works for HDR right, or do these settings fit SDR too?
 
Thanks, I'll try that out tonight to see if it looks right. I have the standard PS4 btw. This only works for HDR right, or do these settings fit SDR too?

For SDR it's just important that you match the black levels (so either Limited with Low or Full with High). For HDR on a standard PS4 you need to set Full/High because for some reason the combination Limited/Low produces black crush. Since you can't set RGB range individually on the PS4 for SDR and HDR, I recommend setting everything to Full/High so that you don't have to constantly change settings depending on content.

There's a catch though: if you use your PS4 to watch Blu-ray movies, the PS4 will automatically change to Limited RGB. When I still had my standard PS4 I used the TV's game modes with High black for gaming, and when watching a Blu-ray I changed to ISF dark room with Low black.

Should you ever upgrade to a Pro you should change all settings to Limited and Low respectively.
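
For reference, the pairings above boil down to this little lookup (just a summary of what I described, nothing official):

# Quick reference for PS4 RGB range vs. LG HDMI black level pairings.
# A mismatch in either direction gives you either crushed blacks or a
# washed-out grey picture.

PAIRINGS = {
    ("Limited", "Low"):  "correct (video levels, 16-235)",
    ("Full",    "High"): "correct (PC levels, 0-255)",
    ("Full",    "Low"):  "crushed blacks and clipped whites",
    ("Limited", "High"): "raised blacks, pale/grey picture",
}

def check(console_rgb_range, tv_black_level):
    return PAIRINGS[(console_rgb_range, tv_black_level)]

print(check("Limited", "High"))  # the pale/grey case from earlier in the thread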
 

mitchman

Gold Member
I've heard so much bad feedback from Xbox One S users regarding UHD Blu Ray playback (compatibility issues) that I feel like you're probably better off with a dedicated player if you're at all serious about UHD Blu Rays.

I have a few UHD Blu-rays (Planet Earth II, Blade Runner Final Cut, The Martian, that Moses movie) and haven't had any issues with them. Any particular movies they had issues with?
 