
LG announces new 8K TVs and.... 48" OLED TV!! OH YEAH BABE

dolabla

Member
Owners starting to appear. This guy just received his 48" CX in the UK today.


There's another owner in that thread too.
 

Jigga117

Member
If you're worried about LG TVs only having 40Gbps instead of 48Gbps, wait for Vizio's new OLED coming later this year.
No one should be worried about it, because 48 Gbps vs 40 Gbps isn't going to matter when the only thing lost was 12-bit 4K@120Hz. The Vizio is also only a 10-bit panel, like every other display out there. This is really a moot point and shouldn't be the decision maker in anyone's purchase. Nothing was lost.
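For anyone who wants the math behind that, here's a quick back-of-the-envelope sketch in Python. The 4400 x 2250 totals are the commonly cited CTA-861 4K timing including blanking, and real FRL links add encoding overhead on top, so treat the exact figures as approximate:

# Rough HDMI bandwidth check. Assumes the commonly cited CTA-861 4K
# timing of 4400 x 2250 total pixels (including blanking); real FRL
# links add encoding overhead on top of these raw rates.

def video_data_rate_gbps(h_total, v_total, refresh_hz, bits_per_component):
    bits_per_pixel = 3 * bits_per_component  # RGB / 4:4:4
    return h_total * v_total * refresh_hz * bits_per_pixel / 1e9

for bpc in (10, 12):
    rate = video_data_rate_gbps(4400, 2250, 120, bpc)
    print(f"4K/120 {bpc}-bit 4:4:4 ~ {rate:.1f} Gbps raw")

# 4K/120 10-bit 4:4:4 ~ 35.6 Gbps raw  -> fits a 40 Gbps link
# 4K/120 12-bit 4:4:4 ~ 42.8 Gbps raw  -> needs the full 48 Gbps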
 

I have a 49" LED TV which fits me just fine; a 48" OLED screen would be amazing. It should be noticeably cheaper than the 55" models, too.

That 48" OLED will make a great gaming monitor too.
 

JohnnyFootball

GerAlt-Right. Ciriously.
No one should be worried about it, because 48 Gbps vs 40 Gbps isn't going to matter when the only thing lost was 12-bit 4K@120Hz. The Vizio is also only a 10-bit panel, like every other display out there. This is really a moot point and shouldn't be the decision maker in anyone's purchase. Nothing was lost.
This is mostly correct. It's not a huge loss, but I would still imagine a 12-bit color signal downsampled to 10-bit would look a tad better than a straight 10-bit signal.

Still it's a loss I could live with.
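A toy way to see why the 12-bit-source argument has some merit: with 12 bits in hand, the set can dither down to 10 and trade banding for fine noise, which a native 10-bit signal has no headroom to do. A minimal, purely illustrative sketch (numpy):

import numpy as np

# Requantize a smooth 12-bit ramp to 10 bits (step of 4 codes) two ways.
rng = np.random.default_rng(0)
ramp12 = np.linspace(0.0, 4095.0, 10_000)

rounded  = np.round(ramp12 / 4) * 4                            # plain rounding
dithered = np.floor(ramp12 / 4 + rng.random(ramp12.size)) * 4  # randomized rounding

print("rounding error (RMS): ", np.sqrt(np.mean((ramp12 - rounded) ** 2)))
print("dithering error (RMS):", np.sqrt(np.mean((ramp12 - dithered) ** 2)))

# Dithering measures slightly worse in RMS terms, but it turns the visible
# quantization bands into fine noise, which tends to look better on real
# gradients -- the "tad better" being argued about above.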
 

JohnnyFootball

GerAlt-Right. Ciriously.
Not without DisplayPort, since Nvidia doesn't support FreeSync through HDMI, sadly :(
LG OLEDs support G-Sync through HDMI, so FreeSync support is irrelevant. It's been there for quite some time.

 

DeepEnigma

Gold Member
This is mostly correct. It's not a huge loss, but I would still imagine a 12-bit color signal downsampled to 10-bit would look a tad better than a straight 10-bit signal.

Still it's a loss I could live with.

You’re not getting a 12-bit color sample since everything is outputting at 10-bit to match the most common HDR standards, correct?
 

rofif

Can’t Git Gud
LG OLEDs support G-Sync through HDMI, so FreeSync support is irrelevant. It's been there for quite some time.

Hell yeah!
So there's G-Sync for Nvidia and FreeSync on consoles? Hell f yeah. I must get one....
Although I've never used a TV as a monitor, and I've never had a real HDR TV. I did have a LOT of monitors, though. Currently a 4K LG 27UK650, which is great but still a filthy IPS. Still, 48" is a lot for a monitor when I find 27" perfect.
 
If you're worried about LG TVs only having 40Gbps instead of 48Gbps, wait for Vizio's new OLED coming later this year.

I don't mean to derail the thread, but I wonder when Vizio plans to launch its 2020 lineup. Typically it's around this time, in May or late spring, but I have heard nothing.
 

JohnnyFootball

GerAlt-Right. Ciriously.
Hell yeah!
So there's G-Sync for Nvidia and FreeSync on consoles? Hell f yeah. I must get one....
Glad you learned something today.

Yes, my PC is hooked up to my OLED and the Nvidia Control Panel recognizes it as a G-Sync display. Since my 2080 Ti is HDMI 2.0 only, I am limited to 4K/60Hz or 1440p/120Hz. Once graphics cards support HDMI 2.1, I should be able to utilize full 10-bit HDR 4K/120Hz through HDMI.

In regards to current consoles:
Xbox One X supports HDMI VRR and FreeSync. My LG OLED allows me to use VRR; it's a mostly useless feature on console, though.
PS4 Pro does not support VRR and gets no benefit from HDMI 2.1.
XSX has been confirmed to have HDMI 2.1 support.
PS5 is expected to, but it has not been officially confirmed.
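On the HDMI 2.0 limit mentioned above, the same kind of rough math shows why 4K tops out at 60Hz there. A sketch in Python, assuming 18 Gbps raw TMDS with 8b/10b coding and the same approximate timing totals as before:

# Why HDMI 2.0 caps out: 18 Gbps raw, but 8b/10b coding leaves ~14.4 Gbps.
HDMI20_USABLE = 18 * 8 / 10  # Gbps

modes = {
    "4K/60  8-bit":  4400 * 2250 * 60 * 24 / 1e9,
    "4K/60 10-bit":  4400 * 2250 * 60 * 30 / 1e9,
    "4K/120 10-bit": 4400 * 2250 * 120 * 30 / 1e9,
}
for name, gbps in modes.items():
    verdict = "fits" if gbps <= HDMI20_USABLE else "does not fit"
    print(f"{name}: {gbps:.1f} Gbps -> {verdict}")

# 4K/60 8-bit (~14.3 Gbps) just squeaks in; 10-bit 4K/60 and anything at
# 120Hz need HDMI 2.1 (or chroma subsampling).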
 

Deleted member 17706

Unconfirmed Member
Tempted to get that 48" and turn it into an ultrawide PC monitor, but considering the burn-in I already see on my 2016 LG B6, which I didn't use as a PC monitor at all, it's probably a really bad idea.
 

mitchman

Gold Member
Owners starting to appear. This guy just received his 48" CX in the UK today.


There's another owner in that thread too.
I think I might get one next week, not decided yet. I have a UW monitor now and a 65" B9 as my main TV, so this'll be a monitor to replace the UW. Problem is, I also ordered a new Ryzen 9 4900H/32GB/1TB SSD laptop, so I'm not sure how wise it is to go completely bonkers now :) We'll see. I so want it, though.
 

Kimd41

Neo Member
Has the input lag been improved? I tried my C9 as a monitor in PC mode + game mode and it was still terrible; even moving the mouse was horrible.
 

Jigga117

Member
Hell yeah!
So there's G-Sync for Nvidia and FreeSync on consoles? Hell f yeah. I must get one....
Although I've never used a TV as a monitor, and I've never had a real HDR TV. I did have a LOT of monitors, though. Currently a 4K LG 27UK650, which is great but still a filthy IPS. Still, 48" is a lot for a monitor when I find 27" perfect.
There is VRR simply because it is HDMI 2.1, so it doesn't matter. The Xbox One X has VRR, yet the C9 doesn't support FreeSync at this point.
 

Jigga117

Member
Has the input lag been improved? I tried my C9 as a monitor in PC mode + game mode and it was still terrible; even moving the mouse was horrible.
I have been using the 77" C9 for a couple of weeks and the 55" C9 since February. I have had no issues with any input lag.
 

dolabla

Member
I think I might get one next week, not decided yet. I have a UW monitor now and a 65" B9 as my main TV, so this'll be a monitor to replace the UW. Problem is, I also ordered a new Ryzen 9 4900H/32GB/1TB SSD laptop, so I'm not sure how wise it is to go completely bonkers now :) We'll see. I so want it, though.

Definitely post back in this thread if you do. They aren't out here in the US yet.
 

Ulysses 31

Member
My main monitor is a 43" Philips, which is 60Hz. There must be something wrong with my settings or cable; I'm not here to bash the OLED, I love it.

I really want an OLED monitor but this bad experience worries me...
Did you change to 120 Hz on PC? >.>
 

Venuspower

Member
So there's G-Sync for Nvidia and FreeSync on consoles? Hell f yeah. I must get one....

The VRR stuff is a bit more complex.
For HDMI there are two standards:
- HDMI Forum VRR
- FreeSync via HDMI

The only difference between HDMI Forum VRR and FreeSync is how they are advertised in the EDID (which is why LG OLEDs need an update to support "FreeSync via HDMI"). You can modify the EDID using CRU to enable FreeSync even today on an LG CX or C9, which shows that the EDID really is the only difference. But without CRU/EDID modification you would not be able to use FreeSync via HDMI until LG releases an update, which is why it is still important to differentiate between FreeSync via HDMI and HDMI Forum VRR even though they are nearly identical.

FreeSync via HDMI was developed by AMD a few years ago because HDMI did not have a specification for VRR at that point. With HDMI 2.1 this changed: HDMI Forum VRR became the standard VRR format over HDMI, making "FreeSync via HDMI" irrelevant going forward (at least if you do not want to use old GPUs). Both standards are inspired by VESA's Adaptive-Sync, but in theory there is no need for FreeSync via HDMI anymore because HDMI now has an official specification that pretty much every company will adopt.

However, Microsoft is still advertising support for both HDMI VRR and FreeSync via HDMI on the Series X, probably because they want legacy support for older monitors that only offer FreeSync via HDMI. Sony, on the other hand, only mentions "VRR (specified by HDMI ver.2.1)" on the PlayStation Blog. So it is possible that the PS5 won't be able to use VRR on "FreeSync via HDMI" monitors unless you modify the EDID, and there are no external tools right now that can do that.

What NVIDIA did is simply add support for HDMI Forum VRR. That is it. Then they put their "G-Sync" label on it, just as they did in January 2019 when they added support for Adaptive-Sync via DisplayPort. If you are wondering why LG had to release an update for their 2019 lineup: that update simply allowed the NVIDIA driver to detect LG TVs as "G-Sync Compatible". Without the update you had to enable G-Sync manually, but even then it worked just fine.
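For the curious, here's a hedged sketch of what that EDID difference actually looks like. Both VRR flavors are advertised as vendor-specific data blocks in the EDID's CTA-861 extension, distinguished by their IEEE OUI; the constants and parsing details below are the commonly documented ones and should be treated as assumptions, not a spec quote:

# Hypothetical sketch: distinguishing the two VRR flavors by which
# vendor-specific data blocks a display's CTA-861 EDID extension carries.
# OUI constants are the commonly documented values -- assumptions, not a
# spec quote.

HDMI_FORUM_OUI = 0xC45DD8   # HF-VSDB, where HDMI Forum VRR is advertised
AMD_OUI        = 0x00001A   # AMD VSDB, where "FreeSync via HDMI" lives

def vendor_ouis(cta_block: bytes):
    """Yield the OUI of every vendor-specific data block in a 128-byte
    CTA-861 extension block."""
    dtd_offset = cta_block[2]      # byte 2 = offset of detailed timings
    i = 4                          # data block collection starts at byte 4
    while i < dtd_offset:
        tag, length = cta_block[i] >> 5, cta_block[i] & 0x1F
        if tag == 3 and length >= 3:              # tag 3 = vendor-specific
            yield int.from_bytes(cta_block[i + 1:i + 4], "little")
        i += 1 + length

def vrr_flavors(cta_block: bytes) -> dict:
    ouis = set(vendor_ouis(cta_block))
    return {"hdmi_forum_vrr": HDMI_FORUM_OUI in ouis,
            "freesync_via_hdmi": AMD_OUI in ouis}

# A tool like CRU "adds FreeSync support" by appending an AMD vendor block
# to this collection; the video signal itself doesn't change.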
 

JohnnyFootball

GerAlt-Right. Ciriously.
Has the input lag been improved? I tried my C9 as a monitor in PC mode + game mode and it was still terrible; even moving the mouse was horrible.
There is no input lag to improve. The LG OLEDs have some of the lowest input lag imaginable. Something is definitely wrong on your end.
The TV should indicate when you start a game whether you are in Auto Low Latency Mode. Once you're in low latency mode you should be getting input lag of around 10 ms.
 

JohnnyFootball

GerAlt-Right. Ciriously.
The VRR stuff is a bit more complex.
For HDMI there are two standards:
- HDMI Forum VRR
- FreeSync via HDMI

The only difference between HDMI Forum VRR and FreeSync is how they are advertised in the EDID (which is why LG OLEDs need an update to support "FreeSync via HDMI"). You can modify the EDID using CRU to enable FreeSync even today on an LG CX or C9, which shows that the EDID really is the only difference. But without CRU/EDID modification you would not be able to use FreeSync via HDMI until LG releases an update, which is why it is still important to differentiate between FreeSync via HDMI and HDMI Forum VRR even though they are nearly identical.

FreeSync via HDMI was developed by AMD a few years ago because HDMI did not have a specification for VRR at that point. With HDMI 2.1 this changed: HDMI Forum VRR became the standard VRR format over HDMI, making "FreeSync via HDMI" irrelevant going forward (at least if you do not want to use old GPUs). Both standards are inspired by VESA's Adaptive-Sync, but in theory there is no need for FreeSync via HDMI anymore because HDMI now has an official specification that pretty much every company will adopt.

However, Microsoft is still advertising support for both HDMI VRR and FreeSync via HDMI on the Series X, probably because they want legacy support for older monitors that only offer FreeSync via HDMI. Sony, on the other hand, only mentions "VRR (specified by HDMI ver.2.1)" on the PlayStation Blog. So it is possible that the PS5 won't be able to use VRR on "FreeSync via HDMI" monitors unless you modify the EDID, and there are no external tools right now that can do that.

What NVIDIA did is simply add support for HDMI Forum VRR. That is it. Then they put their "G-Sync" label on it, just as they did in January 2019 when they added support for Adaptive-Sync via DisplayPort. If you are wondering why LG had to release an update for their 2019 lineup: that update simply allowed the NVIDIA driver to detect LG TVs as "G-Sync Compatible". Without the update you had to enable G-Sync manually, but even then it worked just fine.
LG OLEDs are certified and validated for G-Sync. Most FreeSync monitors will just say G-Sync compatible but are not validated.

VRR on X1X has very little use since most titles are locked to 30 or 60.
 

Venuspower

Member
LG OLEDs are certified and validated for G-Sync. Most FreeSync monitors will just say G-Sync compatible but are not validated.

Yea. The certification process is just an extra step that NVIDIA goes through, but the technology behind "G-Sync via HDMI" is still HDMI Forum VRR. In theory, every display with a proper HDMI VRR implementation should be able to use VRR with a G-Sync-capable GPU without any problems, but for some reason this is not always the case. For example: while you can enable "G-Sync" on a Samsung QLED TV, it will start flickering right after you enable it. It is not known what is causing these problems. Maybe a bad VRR implementation on Samsung's side, or maybe NVIDIA is doing something that prevents a flicker-free VRR experience. Who knows...
 

JohnnyFootball

GerAlt-Right. Ciriously.
Yea. The certification process is just an extra step that NVIDIA goes through, but the technology behind "G-Sync via HDMI" is still HDMI Forum VRR. In theory, every display with a proper HDMI VRR implementation should be able to use VRR with a G-Sync-capable GPU without any problems, but for some reason this is not always the case. For example: while you can enable "G-Sync" on a Samsung QLED TV, it will start flickering right after you enable it. It is not known what is causing these problems. Maybe a bad VRR implementation on Samsung's side, or maybe NVIDIA is doing something that prevents a flicker-free VRR experience. Who knows...
I am aware of that. But so far the LG OLEDs are the only HDMI TVs/monitors that Nvidia has certified as supporting G-Sync. I agree there is no reason not to support Samsung's and Sony's HDMI 2.1 sets as G-Sync, but LG (wisely) probably has a marketing deal with them. It's a pretty big deal; in fact, that deal alone pushed me to get my C9 a little sooner than I anticipated. I suspect those TVs will just have to wait until Nvidia and AMD release full-bandwidth HDMI 2.1 GPUs.

It's just a shame that my Geforce 2080 Ti can't utilize the full bandwidth of HDMI 2.1.
 

Venuspower

Member
It's just a shame that my Geforce 2080 Ti can't utilize the full bandwidth of HDMI 2.1.

Club3D is going to release a DP 1.4 => HDMI 2.1 adapter really soon.
June is the latest information we have.

But it is possible that the first revision of the adapter won't support HDMI VRR, even though they mention it in their catalogue:

[Club3D catalogue screenshot]
 

JohnnyFootball

GerAlt-Right. Ciriously.
Club3D is going to release a DP 1.4 => HDMI 2.1 adapter really soon.
June is the latest information we have.

But it is possible that the first revision of the adapter won't support HDMI VRR, even though they mention it in their catalogue:

[Club3D catalogue screenshot]
There is also no guarantee that this will work, and I'd be curious to see the testing. DisplayPort 1.4 is still limited to about 26 Gbps of usable bandwidth, which does not come remotely close to the 40 Gbps needed for 4K/120 10-bit HDR 4:4:4/RGB. ~26 Gbps gets you somewhere in the ballpark of 4K/80fps at 4:4:4, which I'd be OK with for the time being, since my 2080 Ti doesn't really run that much stuff at framerates that high.

I am aware that Display Stream Compression can compensate, but that feature has never been implemented in any current monitor or display. I am very surprised, since it seems like a feature many would love to take advantage of. My gut feeling is that the reason we haven't seen it is that it doesn't produce results as good as expected and/or causes major input lag.

So far this is the only one I know of that has been announced.
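Rough math on the adapter's premise, in Python. The link rates are the published DP 1.4 figures, the 3:1 ratio is DSC's commonly quoted ceiling, and the timing totals are the same approximation used earlier in the thread, so take the margins as ballpark:

# DP 1.4: 4 lanes x 8.1 Gbps HBR3 = 32.4 Gbps raw; 8b/10b coding leaves
# roughly 25.92 Gbps for video.
DP14_RAW = 4 * 8.1
DP14_USABLE = DP14_RAW * 8 / 10

needed_4k120_10bit = 4400 * 2250 * 120 * 30 / 1e9  # ~35.6 Gbps incl. blanking

print(f"DP 1.4 usable:       {DP14_USABLE:.2f} Gbps")
print(f"4K/120 10-bit needs: {needed_4k120_10bit:.1f} Gbps uncompressed")
print(f"with ~3:1 DSC:       {needed_4k120_10bit / 3:.1f} Gbps")

# Uncompressed it doesn't fit; with DSC applied on the DP side it fits with
# room to spare -- which is the adapter's whole reason for existing.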
 

Venuspower

Member
I am aware that Display Stream Compression can compensate, but that feature has never been implemented in any current monitor or display. I am very surprised, since it seems like a feature many would love to take advantage of. My gut feeling is that the reason we haven't seen it is that it doesn't produce results as good as expected and/or causes major input lag.

As far as I understand, DSC will only be used between the GPU and the adapter, which is why it does not matter whether the TV supports it. The adapter will then decompress the signal and output it over HDMI 2.1 at 48 Gbit/s.

Since DSC is visually lossless, there should not be any real downside to using DSC between the GPU and the adapter.
 

JohnnyFootball

GerAlt-Right. Ciriously.
As far as I understand, DSC will only be used between the GPU and the adapter, which is why it does not matter whether the TV supports it. The adapter will then decompress the signal and output it over HDMI 2.1 at 48 Gbit/s.

Since DSC is visually lossless, there should not be any real downside to using DSC between the GPU and the adapter.
HDMI 2.1 actually supports DSC.

But still, I am very hesitant to believe that it won't have some side effects.
 

Venuspower

Member
HDMI 2.1 actually supports DSC.

But still, I am very hesitant to believe that it won't have some side effects.

At least it is mentioned in the specifications, but that does not mean manufacturers have to implement it. LG, Samsung, etc. do not support DSC on their televisions.

But as I said, DSC will only be used between the GPU and the adapter, which is why it does not matter whether the display supports DSC.
 

JohnnyFootball

GerAlt-Right. Ciriously.
At least it is mentioned in the specifications, but that does not mean manufacturers have to implement it. LG, Samsung, etc. do not support DSC on their televisions.

But as I said, DSC will only be used between the GPU and the adapter, which is why it does not matter whether the display supports DSC.
Do you have a link to the product page from Club3D? I can't find it. I'd love to know more about it. I am VERY skeptical about it working well. If it does, it would make it easy to forgo getting an HDMI 2.1 video card right away.
 

JohnnyFootball

GerAlt-Right. Ciriously.
I'm not so sure since DP 1.4 is 32.4 Gbps.
Re-read the posts between myself and Venuspower.
You will see that if the adapter properly utilizes Display Stream Compression (DSC), then DisplayPort (DP) could easily provide enough bandwidth for a full 48 Gbps HDMI 2.1 signal.

It's a little-known fact that DP 1.4 (and now HDMI 2.1) has support for DSC, which allows data of significantly higher bandwidth to be pushed through than the link could otherwise carry: roughly 2x (~50 Gbps) to 3x (~75 Gbps) the effective rate over a single DP connection. There is 32.4 Gbps of raw link rate on DP 1.4, but only about 26 Gbps of that is usable for video.

The DSC standard was announced back in 2014 and has barely made it into any monitor on the market; Asus announced one with it last year, and I am not aware of any others.

Since this feature has been part of DisplayPort since 1.4, I am very puzzled as to why it has almost never been utilized, as it seems like a very compelling feature that would potentially provide years of future-proofing. Maybe that is why. Even on high-end $1500+ monitors the feature has been absent.

In theory DSC should have no effect on image quality and/or input lag, but I remain very skeptical; there has to be a reason it has never been implemented. With DSC, a single DP cable would allow for not only 4K/120 but even 4K/240 at full 10-bit! The only explanation is that the feature doesn't work as well in practice and there are penalties that make it not worth it.
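The 4K/240 claim holds up under the same rough arithmetic (active pixels only here, ignoring blanking, so real figures land somewhat higher):

def raw_gbps(w, h, hz, bpc):
    return w * h * hz * 3 * bpc / 1e9  # RGB / 4:4:4, no blanking

need = raw_gbps(3840, 2160, 240, 10)
print(f"4K/240 10-bit: {need:.1f} Gbps raw, {need / 3:.1f} Gbps at 3:1 DSC")

# ~59.7 Gbps uncompressed, ~19.9 Gbps after 3:1 DSC -- under DP 1.4's
# ~25.92 Gbps usable rate, which is why 4K/240 over a single cable is
# plausible on paper.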
 

Ulysses 31

Member
Re-read the posts between myself and Venuspower.
You will see that if the adapter properly utilizes Display Stream Compression (DSC), then DisplayPort (DP) could easily provide enough bandwidth for a full 48 Gbps HDMI 2.1 signal.
And if it doesn't support DSC, you'd still want to use that adapter?
 

JohnnyFootball

GerAlt-Right. Ciriously.
And if it doesn't support DSC, you'd still want to use that adapter?
Dude, the whole paragraph is about the adapter utilizing DSC to get full 4K/120Hz. That's the entire point of the product existing!
[Club3D catalogue screenshot]


To answer your question: if it doesn't do what it claims to do, then no. It's worthless and I wouldn't waste my money. I'm going to wait for reviews and tests before I drop any money on it.
 
How deep is your desk :messenger_neutral:

I have an L-shaped desk; it's 27 inches deep. I can put it on my second desk and keep the monitors on the main desk. Just roll back about 3 feet and it will be golden. With a 50-inch TV you need to be within 5 to 7 feet of it.
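That 5-to-7-feet rule of thumb lines up with simple geometry. A sketch assuming the common ~40-degree viewing-angle guideline (rtings' calculator uses similar reasoning, though its exact numbers differ):

import math

def distance_for_angle(diagonal_in, viewing_angle_deg=40, aspect=16 / 9):
    # Screen width from the diagonal, then the distance at which that
    # width spans the target viewing angle.
    width = diagonal_in * aspect / math.hypot(aspect, 1)
    return (width / 2) / math.tan(math.radians(viewing_angle_deg / 2))

for size in (48, 50, 65):
    print(f'{size}" -> ~{distance_for_angle(size) / 12:.1f} ft for a 40-degree view')

# 48" -> ~4.8 ft, 50" -> ~5.0 ft, 65" -> ~6.5 ft.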

 

saintjules

Member
CX C9 review comparison Rtings


Yep, great video. They compare the two C's against the Samsung Q90R and Sony A8G as well.

The C9 is the better option, since there's little to no difference in picture quality imo. For those somehow worried by the 40 Gbps on the CX versus the 48 Gbps on the C9, that's another reason to get the C9. I don't think this will be a cause for concern when gaming on next-gen consoles.
 

Venuspower

Member
Do you have a link to the product page from Club3D? I can't find it. I'd love to know more about it. I am VERY skeptical about it working well. If it does, it would make it easy to forgo getting an HDMI 2.1 video card right away.

There is nothing on the product page at the moment:

But you can find the adapter in their 2020 catalog:

https://club-3d.com/id=8/openpdfnews/6_1/pdf/club3d_catalog_2020_web.pdf/ (direct download!)
 
I have an L-shaped desk; it's 27 inches deep. I can put it on my second desk and keep the monitors on the main desk. Just roll back about 3 feet and it will be golden. With a 50-inch TV you need to be within 5 to 7 feet of it.


ah so another case of

*hangs bigger tv*
*moves couch back a bit more*

Ah yeah, the 5-yearly cycle
 

bargeparty

Member
ah so another case of

Dunno if I agree with the rtings calc. No way I'm sitting 9 ft from a 65" TV.

edit: I just looked at the slider at the top; they have more information below with a totally separate chart/reference that seems better. (Not sure what the point of the slider is, then...)
 