
Television Displays and Technology Thread: This is a fantasy based on OLED

TheBoss1

Member
I mean. No shit. They chose to put the cheaper processor on the low-mid sets from the 900E down to save money, but also to give a USP to the higher-end sets.

Well I think the X900E has the pre-Z9D flagship processor and the other cheaper sets have cheaper processors.
 

RoadHazard

Gold Member
I honestly believe Sony excluded DV from lesser sets than the 930E for bullet points. Why does a TCL P607/605 that is half the price have DV, but Sony can't support it? I'm calling bs.

The thought has definitely crossed my mind (and I don't think the 930E costs significantly more to produce than the 900E), but then why don't they actually use it as a bullet point? There's nothing about DV on their pages for these TVs, you have to find random news articles to get that info. They do advertise the X1 Extreme itself, but that means shit to the average person (but so does DV, I suppose).

One of the reasons I bought my TV (C7) is that it supported all the different HDR formats, so I specifically wanted Dolby Vision support. I'll try to answer the question as best I can with a non-scientific eye, but yes, Dolby Vision looks better. It handles differences in contrast better. Beyond this, it also handles colors better. For example, if there is a neon sign you can see the neon light shining off an object better. Now, are the differences enough to upgrade to the 930E? That's tough for me to answer for you. There would be some bias in my answer because I bought my TV for DV support.

Maybe I can put it in gaming terms. HDR10 seems like the Xbox One of HDR while Dolby Vision is the PS4 of HDR. There is a difference but you still have access to similar content no matter which format you use. If DV is available I'll be watching a show/movie in that over HDR10. But it doesn't make HDR10 useless. HDR10 still looks good.

Thanks. What I've read is that stuff like very dark scenes (with low peak brightness) benefits the most, since DV lets the TV know what the range for the current scene is, so the shades can be remapped to what the TV can actually do and you get better shadow detail etc.
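To make that concrete, here's a toy sketch of the idea (Python, made-up numbers; real tone mapping curves aren't linear like this, and this isn't Dolby's actual algorithm):

# Toy comparison of static vs. per-scene (dynamic) HDR metadata.
def tone_map(pixel_nits, metadata_max_nits, display_max_nits=700.0):
    # Rescale the declared brightness range onto what the panel can show.
    return min(pixel_nits, metadata_max_nits) * display_max_nits / metadata_max_nits

shadow = 5.0  # a dim detail in a night scene that never exceeds 100 nits
# HDR10-style static metadata: one peak (say 4000 nits) for the whole film.
print(tone_map(shadow, metadata_max_nits=4000.0))  # ~0.9 nits -> crushed to black
# DV-style dynamic metadata: this scene declares its own 100-nit peak.
print(tone_map(shadow, metadata_max_nits=100.0))   # 35 nits -> detail survives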
 

Theonik

Member
The thought has definitely crossed my mind (and I don't think the 930E costs significantly more to produce than the 900E), but then why don't they actually use it as a bullet point? There's nothing about DV on their pages for these TVs, you have to find random news articles to get that info. They do advertise the X1 Extreme itself, but that means shit to the average person (but so does DV, I suppose).
Cause they do not have DV support yet. Just a promise at CES to implement it. It's happening soon though.

You think the TCL P607 has a better processor than the X1? I know the X1E is more powerful than the X1 but I doubt the TCL is even close to the X1.
Different implementations. Vizio also did DV for a while before Sony chose to do it. They had a hardware solution.
 

Gitaroo

Member
You think the TCL P607 has a better processor than the X1? I know the X1E is more powerful than the X1 but I doubt the TCL is even close to the X1.

The X1 & X1E are image processors, among the best in TVs for their time; I doubt that TCL TV is even close. The SoC that runs the OS, however, might be better. Sony cheaps out by using garbage MediaTek SoCs in their TVs.
 

tmdorsey

Member
I asked you on a previous page, but can you cite that claim?
Why would Sony include DV hardware on the Z9D more than half a year before they announced any intention to release DV-capable sets and more than a year before they actually did?

Hmm, sure? I thought that chip was simply powerful enough to do it through software (Dolby has created such a solution for devices without the dedicated DV hardware). If true, why won't it be enabled until an update in the fall? And why wouldn't Sony include it as a bullet point for those sets?

This post here from user AAROW-TV, who is a certified THX/ISF calibrator. He claims he got the info directly from Sony.


As to the holdup with applying the update, who knows?
 
As Redline said, it definitely depends on where you sit and the size of the screen. I'm sure if you're 4-5' from an 80" 4K or 4-5' from an 80" 8K, some folks could discern a difference.

So let's assume "proper viewing distance" then. I'm just curious at what point tv resolutions will stop increasing because there's no more visual improvement to be had.
 

RoadHazard

Gold Member
So let's assume "proper viewing distance" then. I'm just curious at what point tv resolutions will stop increasing because there's no more visual improvement to be had.

We're there right now. At anything most people would consider normal viewing distance vs size, we don't need more than 4K. Unless we drastically increase our sizes and get literal TV walls. The next step is 8K, and you need absolutely huge screens (or sit like a foot away) to notice that improvement.
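You can put rough numbers on that. Assuming the usual 20/20 benchmark of about one arcminute of resolving power (eyes vary, and this ignores things like aliasing), here's where adjacent pixels blur together on a 65" 16:9 panel:

import math

def pixels_blend_beyond_ft(diagonal_in, horizontal_px, acuity_arcmin=1.0):
    # Distance past which a ~1-arcminute eye can't separate adjacent pixels.
    width_in = diagonal_in * 16 / math.hypot(16, 9)  # 16:9 panel width
    pitch_in = width_in / horizontal_px              # inches per pixel
    theta = math.radians(acuity_arcmin / 60)         # arcminutes -> radians
    return (pitch_in / theta) / 12                   # inches -> feet

for px, label in [(1920, "1080p"), (3840, "4K"), (7680, "8K")]:
    print(f'65" {label}: pixels blend beyond ~{pixels_blend_beyond_ft(65, px):.1f} ft')
# -> roughly 8.4 ft for 1080p, 4.2 ft for 4K, 2.1 ft for 8K

So on a 65" set you'd already have to sit closer than about four feet before 8K could buy you anything.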
 

Theonik

Member
This post here from user AAROW-TV, who is a certified THX/ISF calibrator. He claims he got the info directly from Sony.
I don't know what being a certified ISF calibrator really has to do with anything. Might as well say he was a plumber; he'd be as qualified to know such things.

Aside from that, I'm not sure I'm interpreting his post correctly. It's hard to pinpoint what was meant without knowing who told him this. There is nothing about dedicated DV hardware in the set in that post. They might as well be talking about the X1E itself being able to decode DV in hardware.

So let's assume "proper viewing distance" then. I'm just curious at what point tv resolutions will stop increasing because there's no more visual improvement to be had.
Proper viewing distance is resolution-dependent. Eventually we're better served using 16K per eye on an HMD for all our viewing; that's about the ultimate limit of the normal human eye. Double that for people with 20/10 vision.
 

RedAssedApe

Banned
You think the TCL P607 has a better processor than the X1? I know the X1E is more powerful than the X1 but I doubt the TCL is even close to the X1.

I'm assuming that the TCL has a hardware implementation for Dolby Vision and Sony is doing it by software (if it needs to be enabled via firmware),

seeing as how there are complaints about how Android TV runs poorly on the X900E.

That or Sony is intentionally doing it as a selling point for the more expensive TVs (assuming the profit margin is higher on those). That and these companies put out new TVs every year. It's always the same song and dance: leave room for improvements for next year's model.
 
Panasonic's high-end summer 2017 EZ1002 OLED costing £6000 doesn't support DV, yet a $600 TCL does.

The key question is how good it will be on current TVs. From the little I've seen the difference is minimal right now, but will it start showing its worth soon on current sets with 700 nits or less of brightness, and on LEDs with 1000+ nits but flawed dimming zones? If yes, then it could impact your resale value if DV stands out on current sets in 2018/19. It seems more likely that DV will show its worth in a few years' time.

I could do with getting a 900E or 930E, but the longer I leave it the more it seems worth waiting till next year.
 

tmdorsey

Member
I don't know what being a certified ISF calibrator really has to do with anything. Might as well say he was a plumber; he'd be as qualified to know such things.

I mentioned it to make the point that he has contacts with Sony product support.

Aside from that, I'm not sure I'm interpreting his post correctly. It's hard to pinpoint what was meant without knowing who told him this. There is nothing about dedicated DV hardware in the set in that post. They might as well be talking about the X1E itself being able to decode DV in hardware.

That post was a follow-up to this post here where he clearly states the Z9D has the newer MediaTek SoC that has Dolby Vision decoding capability and that the decoding was done in hardware, not software. The follow-up post was to get clarification on whether only the Z9D has it or the other X1 Extreme sets have it also.
 

Mrbob

Member
Could you unpack this a bit?

Do you have the same issue? But is it only sometimes? Some of your games (Bayo, W3) seem to work fine?

It seems to me that there is a way to also address this audio issue with CRU, I just don't have enough technical background on how drivers work etc.

The same way you can configure CRU to hack in resolutions, it appears you can hack in audio formats that the receiving device should see as supported. I just can't seem to penetrate a lot of what I'm reading about how to do it.

The DisplayPort adapter for audio just seemed like an easier solution.

In Windows Sound, my HDMI audio output doesn't have a 5.1 setting anymore, only stereo. When I click on Properties it states my audio device can access Dolby Digital and DTS. When I play a game it sounds like I still get 5.1 sound despite Windows saying I only have access to stereo.
 

Theonik

Member
I mentioned it to make the point that he has contacts with Sony product support.
That post was a follow-up to this post here where he clearly states the Z9D has the newer MediaTek SoC that has Dolby Vision decoding capability and that the decoding was done in hardware, not software. The follow-up post was to get clarification on whether only the Z9D has it or the other X1 Extreme sets have it also.
Sony support don't know shit. lol
 
In Windows Sound, my HDMI audio output doesn't have a 5.1 setting anymore, only stereo. When I click on Properties it states my audio device can access Dolby Digital and DTS. When I play a game it sounds like I still get 5.1 sound despite Windows saying I only have access to stereo.

Thank you. This is helpful.

I'm assuming you have the exact same issue I do.

IIRC you said a ways back you have a Yamaha receiver? What is the receiver sound mode set on when you use the PC?

I'm assuming that your receiver is actually only receiving stereo sound and simulating 5.1 which is very, very different than the discrete 5.1 that you (and I) should be getting.
 

Mrbob

Member
Thank you. This is helpful.

I'm assuming you have the exact same issue I do.

IIRC you said a ways back you have a Yamaha receiver? What is the receiver sound mode set on when you use the PC?

I'm assuming that your receiver is actually only receiving stereo sound and simulating 5.1 which is very, very different than the discrete 5.1 that you (and I) should be getting.

Yeah, Yamaha 681. I run it in surround decode mode on PC, so it's possible it's only simulating 5.1.
 
Obviously, there are factors which only you can answer, but some things to consider:

- when will you buy "the next" TV after this one? If you're a 2-3 year TV owner, go for the Vizio now and start saving for 75" plus later on. If you plan to have it for 5+ years, maybe hold out for the 75" so that you're not constantly thinking you should have gone bigger.

- besides the size, what do the reviews say about the Vizio vs. the Sony? I have zero personal experience with Vizio TVs, so this would be where I would start.

- use a price tracker like https://camelcamelcamel.com/Sony-XBR75X850D-Ultra-Smart-75-Inch/product/B01A5LU6R0?context=search to see how similar brand / size TVs perform over time (price-wise). While not a perfect 1:1 correlation, you might be surprised at how quickly that $3,500 becomes $2,400 or less.

Thanks for the feedback. I buy every five years or so, but don't have any big concerns about whether or not I should go bigger. I know I'll be okay with 70"+. It's more a question of whether the quality gap between the models is significant enough.

From what I can tell, the Vizio M series is a pretty good TV for what I'd primarily use it for, movies and gaming. It rates more in line with the Sony 850E, though. The Vizio has local dimming while the 850 does not. The 75" 850 is $2,999, though it looks like I just missed a deal where it was $2,499.

Again, thanks for the input and I'll definitely use the price tracker you posted.
 
Yeah, Yamaha 681. I run it in surround decode mode on PC, so it's possible it's only simulating 5.1.

Based on pg. 68 of your owner's manual, that's exactly what's happening.

If you use "Straight decode" where it only plays sound in the channels it's receiving information for, you will likely only get sound from your front left and right.

You likely have the same issue that I do. Which is a bummer.
 

Mrbob

Member
Based on pg. 68 of your owner's manual, that's exactly what's happening.

If you use "Straight decode" where it only plays sound in the channels it's receiving information for, you will likely only get sound from your front left and right.

You likely have the same issue that I do. Which is a bummer.



So I guess it's time to delete CRU?

Did anyone do a comparison between identical pieces of media encoded in both HDR10 and DV?

There's a Forbes contributor who has reviewed the Despicable Me 1 and 2 and Furious 8 discs in both HDR10 and DV.

https://www.forbes.com/sites/johnar...nt-despicable-me-1-and-2-review/#30c252ac6994

Link to Furious 8 in there too. DV seems more robust overall, though the reviewer found some DV issues in darker scenes of Furious 8. DV is still new, so I'm guessing companies are figuring it out.
 
So I guess it's time to delete CRU?

If you want.

I'm not sure what your video card situation is (1080 FE here), but you won't be able to run at resolutions between 1440p and 4K if you do delete it.

That's my issue. With a 1080FE there are plenty of games I can run at near-4K but not quite 4K, so having to go all the way back down to 1440p would be a bummer.

I ordered a DisplayPort to HDMI adapter. This enables you to send audio via DP out to the receiver for 5.1 and then run directly to the TV with the primary HDMI, so the audio thing is a non-issue.

It's a dumb workaround and means an extra cable, but it should theoretically work.

Again, I know there's some way to fix this in CRU by overriding audio settings, I'm just not fluent enough in software/driver tech to understand it all.
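For anyone more fluent than me, here's roughly what those CRU audio overrides are editing, if I'm reading the CEA-861 layout right: each format in the EDID's audio block is a 3-byte "Short Audio Descriptor". A decoding sketch, not a full EDID parser:

FORMATS = {1: "LPCM", 2: "AC-3 (Dolby Digital)", 7: "DTS", 10: "E-AC-3 (DD+)"}
RATES_KHZ = [32, 44.1, 48, 88.2, 96, 176.4, 192]  # one per bit of byte 2

def decode_sad(b1, b2, b3):
    fmt = (b1 >> 3) & 0x0F      # byte 1, bits 6-3: audio format code
    channels = (b1 & 0x07) + 1  # byte 1, bits 2-0: max channels minus one
    rates = [r for i, r in enumerate(RATES_KHZ) if b2 & (1 << i)]
    # byte 3 is max bitrate / 8 kbps for compressed formats, bit depths for LPCM
    return FORMATS.get(fmt, f"format {fmt}"), channels, rates

# A descriptor advertising 5.1 Dolby Digital at up to 640 kbps:
print(decode_sad(0x15, 0x07, 0x50))
# -> ('AC-3 (Dolby Digital)', 6, [32, 44.1, 48])

So in theory the fix is adding descriptors like that back so Windows sees 5.1 support again, but I haven't tried it myself.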
 

Mrbob

Member
If you want.

I'm not sure what your video card situation is (1080 FE here), but you won't be able to run at resolutions between 1440p and 4K if you do delete it.

That's my issue. With a 1080FE there are plenty of games I can run at near-4K but not quite 4K, so having to go all the way back down to 1440p would be a bummer.

I ordered a DisplayPort to HDMI adapter. This enables you to send audio via DP out to the receiver for 5.1 and then run directly to the TV with the primary HDMI, so the audio thing is a non-issue.

It's a dumb workaround and means an extra cable, but it should theoretically work.

Again, I know there's some way to fix this in CRU by overriding audio settings, I'm just not fluent enough in software/driver tech to understand it all.

I have a 1070, so if this works for you I think I'm going to do this. I've been thinking about running my PC input directly to my TV because I could keep PC-specific video settings on a separate HDMI port, as I use different settings for watching movies versus playing games on PC.
 
I keep going back and forth on whether to get the LG C7 or wait for potential HDMI 2.1 TVs. I'll end up having the TV for a minimum of five years. Feels like I'm too worried about future proofing when I shouldn't be.
 
I think that's referring to the latest screwed up FW for the C6, E6 and G6 that enabled HLG support? So no guarantee that anything will change for the B6.

Well shit... there goes my hope. Right now it doesn't matter for me because I only have the Xbox One, no HDR consoles here until November this year. But I hope LG has fixed it by then, because HDR Game mode is bullshit right now on the 65B6. It completely ruins the otherwise bright and vibrant picture this TV provides.
 

BumRush

Member
I keep going back and forth on whether to get the LG C7 or wait for potential HDMI 2.1 TVs. I'll end up having the TV for a minimum of five years. Feels like I'm too worried about future proofing when I shouldn't be.

Significant amounts of content really benefitting from 2.1 seem to be years away (3+, I'd think....not counting specific use cases). Do you have a top end PC for gaming? If not, I wouldn't go crazy.
 

RoadHazard

Gold Member
Significant amounts of content really benefitting from 2.1 seem to be years away (3+, I'd think....not counting specific use cases). Do you have a top end PC for gaming? If not, I wouldn't go crazy.

Uncompressed Dolby Atmos via ARC (from streaming apps on the TV) is a pretty big one that is relevant NOW. It's technically possible with HDMI 2.0 (LG's OLEDs do it), but only with compressed DD+ audio.
 
What about ARC and input lag?

Couple thoughts...

ARC requires HDMI CEC which, almost ten years on or w/e, is still a problematic technology with some of the same issues it's had since day one. Esp when you aren't using the same brand for all your devices, things kind of go to shit honestly.

eARC sounds great on paper but I have zero confidence it will address any of the existing ARC issues. Also, I still think ARC is a limited use-case scenario. Right now the only lossless audio that exists isn't on anything that requires ARC; it's on Blu-ray discs.

Furthermore, we can go down the path of "do you really have speakers that are capable of resolving all the detail that lossless audio produces, versus the standard Dolby Digital Plus bitstream" argument. For a lot of people, it probably makes no difference.

And I'm not sure what you're referring to about input lag.

Uncompressed Dolby Atmos via ARC (from streaming apps on the TV) is a pretty big one that is relevant NOW. It's technically possible with HDMI 2.0 (LG's OLEDs do it), but only with compressed DD+ audio.

And while this is true, there are various boxes that can do full Atmos: Roku Ultra, NV Shield TV, etc. They don't do DV just yet, but that's likely more a matter of time than an "it will never have DV" thing.
 

BumRush

Member
What about ARC and input lag?

Uncompressed Dolby Atmos via ARC (from streaming apps on the TV) is a pretty big one that is relevant NOW. It's technically possible with HDMI 2.0 (LG's OLEDs do it), but only with compressed DD+ audio.

ARC, you're right, but is it something that is worth waiting for outside of some more specific use cases? Not sure and that's up to broncobuster to decide.

Input lag though? What is this in reference to?
 
How did the whole thing start that made LG eventually add an HDR game mode to certain OLED televisions? Because we need that to happen for the current HDR Game mode as well. I've read on several forums now that LG apparently is working on fixing it for the E6 and G6, but not a word about the B6 at all... and that is bad, very bad. If they end up fixing it for all these televisions, I sure as hell hope they will for the B6 too.
 

RoadHazard

Gold Member
Couple thoughts...

ARC requires HDMI CEC which, almost ten years on or w/e, is still a problematic technology with some of the same issues it's had since day one. Esp when you aren't using the same brand for all your devices, things kind of go to shit honestly.

eARC sounds great on paper but I have zero confidence it will address any of the existing ARC issues. Also, I still think ARC is a limited use-case scenario. Right now the only lossless audio that exists isn't on anything that requires ARC; it's on Blu-ray discs.

Furthermore, we can go down the path of "do you really have speakers that are capable of resolving all the detail that lossless audio produces, versus the standard Dolby Digital Plus bitstream" argument. For a lot of people, it probably makes no difference.

Hmm, yeah, good point... I guess the likelihood of Netflix (or anyone) starting to stream lossless audio (TrueHD, etc.) in the next 5 years is pretty slim? I believe they currently do DD+ (with or without Atmos metadata)? Which is exactly what current TVs can do through ARC. So yeah, that actually is a moot point right now.

What isn't a moot point is whether Sony ever plans on adding this capability to their current TVs, like LG has done. There should be no technical reason why they couldn't. If I buy an Atmos-capable receiver and buy a pair of Atmos modules I don't want that to only be usable with UHD BDs. And I would prefer not to have to buy a separate device for it, when the TV has all the apps and technical ability to do it.

I imagine Variable Refresh Rate, which some say can virtually eliminate input lag. However, if it's already only at 21 ms I'm not sure anyone could notice anything much lower than that.

Are we basically talking G-sync for TVs? Is that something that's likely to happen anytime soon? And even if it is, it would only be usable with PCs until (possibly) the next console generation.
 

BumRush

Member
I imagine Variable Refresh Rate, which some say can virtually eliminate input lag. However, if it's already only at 21 ms I'm not sure anyone could notice anything much lower than that.

Oh okay, yeah maybe because I'm not a competitive gamer but 21 ms is effectively 0 for me.
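For scale: a frame at 60 fps takes 1000/60, about 16.7 ms, so 21 ms is barely over one frame of delay.

frame_ms = 1000 / 60   # ~16.7 ms per frame at 60 fps
print(21 / frame_ms)   # ~1.26 frames of input lag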
 
Significant amounts of content really benefitting from 2.1 seem to be years away (3+, I'd think....not counting specific use cases). Do you have a top end PC for gaming? If not, I wouldn't go crazy.

I do, and am planning to build a newer desktop soon, but I mostly play PC games on my monitor so it's not a huge deal. Dolby Atmos isn't a big concern for me as I live in a building with neighbors and won't go big into audio setups.

It's probably not worth waiting in my situation. Plus even if I wait for an LG OLED with 2.1, I'll end up waiting even longer for them to drop in price. I think going with the C7 is my best bet.

Thanks, all.
 
I imagine Variable Refresh Rate, which some say can virtually eliminate input lag. However, if it's already only at 21 ms I'm not sure anyone could notice anything much lower than that.

So the thing that VRR enables is the elimination of screen tearing. Alternatively you can use V-sync, which waits to "sync up" frames to eliminate tearing. V-sync traditionally causes input lag, so yeah, VRR could potentially kind of eliminate it from that perspective (if you care about tearing).

That being said, it's a PC concern. Both PS4 Pro and the upcoming Xbox One X have HDMI 2.0, so there won't necessarily be console support until we better understand if some/any of the feature set for 2.1 is backwards compatible with 2.0.
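If it helps, here's the latency difference in toy form (Python, idealized; not any real API, and it ignores the game's own pipeline):

import math

REFRESH_S = 1 / 60  # fixed 60 Hz panel

def vsync_latency(render_s):
    # V-sync: the frame waits for the next fixed refresh boundary.
    return math.ceil(render_s / REFRESH_S) * REFRESH_S

def vrr_latency(render_s):
    # VRR: the panel starts scanout as soon as the frame is ready.
    return render_s

for t in (0.010, 0.017, 0.025):
    print(f"{t*1000:.0f} ms frame -> vsync {vsync_latency(t)*1000:.1f} ms, VRR {vrr_latency(t)*1000:.1f} ms")
# A 17 ms frame costs a whole 33.3 ms slot with V-sync; with VRR it just costs 17 ms.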
 

Dosia

Member
How did the whole thing start that made LG eventually add an HDR game mode to certain OLED televisions? Because we need that to happen for the current HDR Game mode as well. I've read on several forums now that LG apparently is working on fixing it for the E6 and G6, but not a word about the B6 at all... and that is bad, very bad. If they end up fixing it for all these televisions, I sure as hell hope they will for the B6 too.

I'm sure it will get updated, especially if it was working fine before the FW update.
 

Mrbob

Member
HDMI 2.1 will be interesting, because it seems like each new HDMI spec launch is riddled with issues and takes about a year to clear up. Additionally, I've seen complaints about HDMI cables not handling 18 Gbps of bandwidth, and I wonder how HDMI cables are going to handle 48 Gbps. I'll let all the HDMI 2.1 issues play out over the next couple of years and look at an HDMI 2.1 device when the PS5 is out (which I'm guessing is 2020). Hopefully the HDMI 2.1 issues are ironed out by then.
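For context on why 18 Gbps is already tight, the standard 4K60 signal (blanking intervals included) nearly fills it:

# Back-of-envelope using the standard CTA-861 4K60 timing:
h_total, v_total, fps = 4400, 2250, 60   # 3840x2160 active + blanking
pixel_clock = h_total * v_total * fps    # 594 MHz
tmds_gbps = pixel_clock * 3 * 10 / 1e9   # 3 TMDS channels, 8b/10b encoding
print(f"4K60 8-bit RGB: {tmds_gbps:.2f} Gbps of the 18 Gbps budget")  # 17.82
# 10-bit HDR at 4K60 already doesn't fit without chroma subsampling, and
# 8K60 is roughly four times the pixels, hence the 48 Gbps link in 2.1.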

Though I'm more interested in future TVs that can start hitting the full Dolby Vision spec than in other HDMI 2.1 features in general.
 

Kyoufu

Member
How did the whole thing start that made LG eventually add an HDR game mode to certain OLED televisions? Because we need that to happen for the current HDR Game mode as well. I've read on several forums now that LG apparently is working on fixing it for the E6 and G6, but not a word about the B6 at all... and that is bad, very bad. If they end up fixing it for all these televisions, I sure as hell hope they will for the B6 too.

What needs to be fixed?
 

RoadHazard

Gold Member

"You don’t need to buy a pricey flagship to enjoy the benefits of the format."

Oh rly. What cheaper sets (except for those TCLs I can't get here) have DV?

EDIT: Finished the piece now, and that reads like a DV ad, without really managing to say what's so much better about it. Like, it's better I guess? But could also be because of different calibration. It actually makes HDR10 sound pretty good.
 

Mrbob

Member
"You don’t need to buy a pricey flagship to enjoy the benefits of the format."

Oh rly. What cheaper sets (except for those TCLs I can't get here) have DV?

EDIT: Finished the piece now, and that reads like a DV ad, without really managing to say what's so much better about it. Like, it's better I guess? But could also be because of different calibration. It actually makes HDR10 sound pretty good.

I believe the statement is just saying you can get Dolby Vision on a relatively inexpensive set.

Rtings has a bit more of a breakdown on HDR10 vs Dolby Vision if you want to go deeper on the topic:

http://www.rtings.com/tv/learn/hdr10-vs-dolby-vision

A lot of stuff HDR10+ is going to address has already been figured out for Dolby Vision.

And HDR10 is good. I just watched The Revenant 4K UHD disc last night and thought the HDR effect was great.
 

RedAssedApe

Banned
"You don’t need to buy a pricey flagship to enjoy the benefits of the format."

Oh rly. What cheaper sets (except for those TCLs I can't get here) have DV?

EDIT: Finished the piece now, and that reads like a DV ad, without really managing to say what's so much better about it. Like, it's better I guess? But could also be because of different calibration. It actually makes HDR10 sound pretty good.

Does AVS do sponsored articles? I noticed a big "Brought to you by TCL" banner at the bottom.
 
Hmm, yeah, good point... I guess the likelihood of Netflix (or anyone) starting to stream lossless audio (TrueHD, etc.) in the next 5 years is pretty slim? I believe they currently do DD+ (with or without Atmos metadata)? Which is exactly what current TVs can do through ARC. So yeah, that actually is a moot point right now.

What isn't a moot point is whether Sony ever plans on adding this capability to their current TVs, like LG has done. There should be no technical reason why they couldn't. If I buy an Atmos-capable receiver and buy a pair of Atmos modules I don't want that to only be usable with UHD BDs. And I would prefer not to have to buy a separate device for it, when the TV has all the apps and technical ability to do it.

Yeah I get not wanting a separate device, but depending on the interoperability of your stuff when it comes to HDMI CEC, you may not have a choice. It's hands down the worst part of HDMI.

Also, Roku Ultra does 4K/HDR and Atmos for $100. Not a deal breaker if you've spent money on a TV and Atmos audio system.

Still, you're not wrong. Would be great if it all worked but man, there are all kinds of AVS threads just littered with issues for ARC, so best of luck I guess.

Are we basically talking G-sync for TVs? Is that something that's likely to happen anytime soon? And even if it is, it would only be usable with PCs until (possibly) the next console generation.

Yes, G-sync for TVs. And yes, PC now (theoretically) and MS and Sony consoles at some point years from now. Who knows.
 