I think you meant to respond to the other guy below my post, lol. No concern here. I'm excited.
My apologies.
> If you're worried about LG TVs only having 40Gbps instead of 48Gbps, wait for Vizio's new OLED coming later this year.
No one should be worried about it, because 48Gbps vs. 40Gbps isn't going to matter when the only loss was 12-bit 4K@120Hz. The Vizio is also only a 10-bit panel, like every other display out there. This is really a moot point and shouldn't be the deciding factor in anyone's purchase. Nothing was lost.
LG's 2020 TVs: Massive 8K screens and the first 48-inch 4K OLED
Prior to LG's CES 2020 press conference this morning, the company had already revealed a slew of "Real 8K" televisions, and announced that the rollable 4K OLED TV we saw at last year's show will be ready to go on sale later this year. Both of those were present in its demo area, and looked...
www.engadget.com
I have a 49" LED TV, which suits me just fine; a 48" OLED screen would be amazing. It should be quite a bit cheaper than the 55" counterparts, too.
> that 48" oled will make a great gaming monitor too.
Not without DisplayPort, since Nvidia doesn't support FreeSync through HDMI, sadly.
> No one should be worried about it because 48Gbps or 40 isn't going to matter when the loss was 12-bit 4K@120Hz. The Vizio is also only a 10-bit panel like every other display out there. This is really a moot point and shouldn't be the decision maker in anyone's purchase. Nothing was lost.
This is mostly correct. It's not a huge loss, but I would still imagine a 12-bit color signal downsampled to 10-bit would look a tad better than a straight 10-bit signal.
> not without display port since nvidia don't support freesync through hdmi sadly
LG OLEDs support GSync through HDMI, so FreeSync support is completely irrelevant. It's been there for quite some time.
Still, it's a loss I could live with.
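The 40 vs. 48 Gbps question being debated here can be sanity-checked with some back-of-envelope math. This is a rough sketch, not an authoritative calculation; the CVT-R2 reduced-blanking raster (3920x2222 total for 3840x2160) and HDMI 2.1 FRL's 16b/18b coding overhead are my assumptions, not figures from the thread.

```python
# Back-of-envelope check: which 4K120 RGB bit depths fit a 40 Gbps vs.
# a 48 Gbps HDMI 2.1 link? Assumes CVT-R2 reduced-blanking timing and
# FRL's 16b/18b line coding (16/18 of the link rate is payload).

H_TOTAL, V_TOTAL, REFRESH = 3920, 2222, 120          # total raster, Hz
PIXEL_CLOCK = H_TOTAL * V_TOTAL * REFRESH            # pixels per second

def required_gbps(bits_per_component: int, components: int = 3) -> float:
    """Uncompressed payload in Gbit/s for RGB / 4:4:4 video."""
    return PIXEL_CLOCK * bits_per_component * components / 1e9

def frl_payload_gbps(link_gbps: float) -> float:
    """Usable payload of an HDMI 2.1 FRL link after 16b/18b coding."""
    return link_gbps * 16 / 18

for depth in (8, 10, 12):
    need = required_gbps(depth)
    fits40 = "ok" if need <= frl_payload_gbps(40) else "no"
    fits48 = "ok" if need <= frl_payload_gbps(48) else "no"
    print(f"4K120 {depth}-bit RGB: {need:5.1f} Gbps "
          f"(40G link: {fits40}, 48G link: {fits48})")
```

Under these assumptions, 10-bit 4K120 RGB (~31.4 Gbps) fits the LG sets' 40 Gbps link, and only the 12-bit signal (~37.6 Gbps) needs the full 48 Gbps, which is exactly the "only loss was 12-bit" point made above.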
> LG OLEDs support GSync. Freesync support is completely irrelevant. It's been there for quite some time.
Hell yeah!
Nvidia and LG Debut 12 New G-Sync Compatible OLED TVs at CES 2020 | Digital Trends
Nvidia and LG announced a range of brand new OLED TVs at CES 2020. Available in sizes ranging from 48 to 88 inches, these new screens are rated as G-Sync compatible. While that means they don't have all the features of G-Sync Ultimate screens, screen tearing should be a thing of the past.
www.digitaltrends.com
> Hell yeah!
Glad you learned something today.
So there is gsync for nvidia and freesync on consoles? Hell f yeah. I must get one....
> Owners starting to appear. This guy just received his 48" CX in the UK today.
I think I might get one next week, not decided yet. I have an UW monitor now and a 65" B9 as my main TV, so this'll be a monitor to replace the UW. Problem is I also ordered a new Ryzen 9 4900H/32GB/1TB SSD laptop, so I'm not sure how wise it is to go completely bonkers now. We'll see. I so want it though.
48" OLED: LG OLED48CX5LC or Sony KD48A9BU?
And is this data available to see? You have to be a member, which I believe is 25 dollars a year. This is how they get their 27 million yearly budget for research. They test and review about 250 TVs a year. For instance, they will test every size of the CX except for the 88" model. If your...
www.avforums.com
There's another owner in that thread too.
> Tempted to get that 48" and turn it into an ultrawide PC monitor, but considering the burn-in I already see on my 2016 LG B6 model, which I didn't use as a PC monitor at all, it's probably a really bad idea.
I keep my monitors hooked up to my desktop for casual work and switch to the OLED for gaming time!
> Hell yeah!
There is VRR simply because it is HDMI 2.1, so it doesn't matter. The Xbox One X has VRR, yet the C9 doesn't support FreeSync at this point.
I've never used a TV as a monitor, though, and I've never used a real HDR TV. I have had a LOT of monitors. Currently a 4K LG 27UK650, which is great but still a filthy IPS. That said, 48" is a lot for a monitor, while I find 27" perfect.
> Has the input lag been improved? I tried my C9 as a monitor in PC mode + game mode and it was still terrible; even moving the mouse was horrible.
I have been using the 77C9 for a couple of weeks and the 55" C9 since February. I have had no issues with any input lag.
> I agree, but hopefully we will have new cards this year that support HDMI 2.1.
I see. I just got my 2070 last year and it was quite expensive for being just a GPU, so I'm not looking forward to spending even more soon. Maybe in 2 years.
> 13ms is horrible? Sounds like you're cursed to a life of 240Hz TN monitors, bro.
My main monitor is a 43" Philips, which is 60Hz. There must be something wrong with my settings or cable. I'm not here to bash the OLED, I love it.
> My main monitor is a 43" Philips, which is 60Hz. There must be something wrong with my settings or cable. I'm not here to bash the OLED, I love it.
Did you change to 120 Hz on PC? >.>
I really want an OLED monitor but this bad experience worries me...
> Did you change to 120 Hz on PC? >.>
You don't even have to do that. Something may be off in their settings. I can be in any mode/resolution, with no issues on either TV.
> Has the input lag been improved? I tried my C9 as a monitor in PC mode + game mode and it was still terrible; even moving the mouse was horrible.
There is no input lag to improve. The LG OLEDs have some of the lowest input lag imaginable. Something is definitely wrong on your end.
> LG OLEDs are certified and validated for GSync. Most FreeSync monitors will just say GSync compatible but not validated.
The VRR stuff is a bit more complex. For HDMI there are two standards:
- HDMI Forum VRR
- FreeSync via HDMI
The only difference between HDMI Forum VRR and FreeSync is how they are advertised in the EDID (which is why LG OLEDs need an update to support "FreeSync via HDMI"). You can modify the EDID using CRU to enable FreeSync even today on an LG CX or C9, which shows that the EDID really is the only difference. Without CRU/EDID modification, though, FreeSync via HDMI won't work until LG releases an update, which is why it is still important to differentiate between FreeSync via HDMI and HDMI Forum VRR even though they are nearly identical.
FreeSync via HDMI was developed by AMD a few years ago because HDMI did not have a specification for VRR at that point. With HDMI 2.1 this has changed, and HDMI Forum VRR became the standard VRR format over HDMI, making "FreeSync via HDMI" irrelevant going forward (at least if you do not want to use old GPUs). Both standards are inspired by VESA's Adaptive-Sync. In theory there is no need for FreeSync via HDMI anymore, because HDMI now has an official specification that will be used by pretty much every company going forward.
However, Microsoft is still advertising support for both HDMI Forum VRR and FreeSync via HDMI on the Series X, probably because they want legacy support for older monitors that only offer FreeSync via HDMI. Sony, on the other hand, only mentions "VRR (specified by HDMI ver.2.1)" on the PlayStation Blog. So it might be that the PS5 won't be able to use VRR on "FreeSync via HDMI" monitors unless you modify the EDID, and there are no external tools right now that are able to do that.
What NVIDIA did is simple: they added support for HDMI Forum VRR and put their "G-Sync" label on it, just as they did in January 2019 when they added support for Adaptive-Sync via DisplayPort. If you are wondering why LG had to release an update for their 2019 lineup: that update simply allowed the NVIDIA driver to detect LG TVs as "G-Sync Compatible" automatically. Without the update you had to enable G-Sync manually, but even then it worked just fine.
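To illustrate the point that the EDID advertisement is the only real difference: both VRR flavors show up as vendor-specific data blocks in the EDID's CTA-861 extension, just under different IEEE OUIs. Here is a rough Python sketch of how such blocks are walked; the OUI constants are my assumptions based on commonly documented values, and the sample bytes are hypothetical, not data from this thread.

```python
# Sketch: detect which HDMI VRR flavor an EDID advertises. A CTA-861
# extension payload is a sequence of data blocks, each headed by a byte
# with a 3-bit tag and a 5-bit length. Vendor-specific blocks (tag 3)
# begin with a 3-byte little-endian IEEE OUI.

HDMI_FORUM_OUI = 0xC45DD8   # HF-VSDB, carries HDMI Forum VRR capability (assumed)
AMD_OUI = 0x00001A          # AMD vendor block, "FreeSync via HDMI" (assumed)

def vendor_ouis(cta_payload: bytes) -> list:
    """Walk CTA data blocks and return the OUI of each vendor block."""
    ouis, i = [], 0
    while i < len(cta_payload):
        tag = cta_payload[i] >> 5
        length = cta_payload[i] & 0x1F
        if tag == 3 and length >= 3:          # vendor-specific data block
            body = cta_payload[i + 1:i + 1 + length]
            ouis.append(body[0] | body[1] << 8 | body[2] << 16)
        i += 1 + length
    return ouis

# Hypothetical payload: one audio block (tag 1, length 3) followed by one
# vendor block (tag 3, length 8) advertising the HDMI Forum OUI.
sample = bytes([0x23, 0x09, 0x07, 0x07,
                0x68, 0xD8, 0x5D, 0xC4, 0x01, 0x78, 0x00, 0x00, 0x00])
print([hex(o) for o in vendor_ouis(sample)])   # -> ['0xc45dd8']
```

A tool like CRU essentially lets you add or edit exactly these vendor blocks, which is why an EDID override is enough to expose "FreeSync via HDMI" on a display that already does HDMI Forum VRR.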
> Has the input lag been improved? I tried my C9 as a monitor in PC mode + game mode and it was still terrible; even moving the mouse was horrible.
Try not using PC mode; I seem to remember some issues with that one. Use just game mode.
> I am aware of that. But so far the LG OLEDs are the only HDMI TVs/monitors that Nvidia has certified as supporting GSync. I agree there is no reason not to support Samsung's and Sony's HDMI 2.1 sets as GSync, but LG (wisely) probably has a marketing deal with them. It's a pretty big deal. In fact, that deal alone pushed me to get my C9 a little sooner than I anticipated. I suspect those TVs will just have to wait until Nvidia and AMD release full-bandwidth HDMI 2.1 GPUs.
Yea. The certification process is just an extra step that NVIDIA goes through. But the technology behind "G-Sync via HDMI" is still HDMI Forum VRR. In theory, every display with a proper HDMI VRR implementation should be able to use VRR with a compatible G-Sync GPU without any problems. But for some reason this is not always the case. For example: while you are able to enable "G-Sync" with a Samsung QLED TV, it will start flickering right after G-Sync is enabled. It is not known what is causing these problems. Maybe a bad VRR implementation on Samsung's side, or maybe NVIDIA is doing something that prevents a flicker-free VRR experience. Who knows...
It's just a shame that my Geforce 2080 Ti can't utilize the full bandwidth of HDMI 2.1.
> Club3D is going to release a DP 1.4 => HDMI 2.1 adapter really soon. June is the latest information we got. But it might be possible that the first revision of the adapter won't be able to support HDMI VRR, even though they mention it in their catalogue:
There is also no guarantee that this will work. I'd be curious to see the testing. DisplayPort 1.4 is still limited to about 25.92 Gbps of usable data rate, which does not come remotely close to the 40 Gbps needed for 4K/120 10-bit HDR 4:4:4/RGB. Roughly 26 Gbps gets you somewhere in the ballpark of 4K/80fps at 4:4:4, which would be OK for the time being, since my 2080 Ti doesn't really run all that much stuff at framerates that high anyway.
I am aware that Display Stream Compression can compensate, but that feature has never been implemented in any current monitor or display. I am very surprised too, since it seems like a feature many would love to take advantage of. My gut feeling is that the reason we haven't seen it is that it doesn't produce results as good as expected and/or causes major input lag.
> HDMI 2.1 actually supports DSC.
As far as I understand, DSC will only be used between the GPU and the adapter, which is why it does not matter whether the TV supports it. The adapter will then decompress the signal and output it over HDMI 2.1 at 48 Gbit/s. And since DSC is visually lossless compression, there should not be any noticeable downside to using DSC on the GPU-to-adapter link.
> HDMI 2.1 actually supports DSC.
But I am still very hesitant to believe that it won't have some side effects.
> Do you have a link to the product page from Club3D? I can't find it. I'd love to know more about it. I am VERY skeptical about it working well. If it does, it would make it easy to forgo getting an HDMI 2.1 video card right away.
At least it is mentioned in the specifications. But that does not mean that manufacturers have to implement it. LG, Samsung etc. do not support DSC on their televisions. But as I said, DSC will only be used between the adapter and the GPU, which is why it does not matter whether the display supports DSC.
> if it does, it would make it easy to forgo getting an HDMI 2.1 video card right away.
I'm not so sure, since DP 1.4 is 32.4 Gbps.
> I'm not so sure, since DP 1.4 is 32.4 Gbps.
Re-read the posts between myself and Venuspower.
> Re-read the posts between myself and Venuspower. You will see that if the adapter properly utilizes Display Stream Compression (DSC), then DisplayPort could easily provide enough bandwidth for a full 48Gbps HDMI 2.1 signal.
And if it doesn't support DSC, you'd still want to use that adapter?
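The bandwidth argument in that exchange is easy to put into numbers. Here is a rough sketch; the HBR3 figures (32.4 Gbps raw, 25.92 Gbps after 8b/10b encoding), the roughly 3:1 maximum DSC ratio, and the CVT-R2 reduced-blanking 4K120 timing are my assumptions, not figures from the thread.

```python
# Rough check of "DSC makes DP 1.4 enough for a full HDMI 2.1 signal".
# DP 1.4 HBR3: 32.4 Gbps raw link rate; 8b/10b coding leaves 8/10 of
# that for payload. DSC is specified for ratios up to about 3:1.

DP14_RAW = 32.4
DP14_PAYLOAD = DP14_RAW * 8 / 10                  # ~25.92 Gbps usable

# Assumed CVT-R2 reduced-blanking raster for 3840x2160@120.
PIXEL_CLOCK = 3920 * 2222 * 120                   # ~1.045 GHz

def stream_gbps(bits_per_pixel: int) -> float:
    """Uncompressed video payload in Gbit/s."""
    return PIXEL_CLOCK * bits_per_pixel / 1e9

for name, bpp in [("10-bit RGB", 30), ("12-bit RGB", 36)]:
    uncompressed = stream_gbps(bpp)
    ratio = uncompressed / DP14_PAYLOAD           # ratio DSC must achieve
    print(f"4K120 {name}: {uncompressed:.1f} Gbps uncompressed, "
          f"needs {ratio:.2f}:1 over DP 1.4 (DSC allows up to ~3:1)")
```

Under these assumptions, even 12-bit 4K120 RGB needs only about a 1.5:1 compression ratio, comfortably within DSC's range, while the uncompressed stream would not fit in DP 1.4's payload at all.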
> that 48" oled will make a great gaming monitor too.
How deep is your desk?
> And if it doesn't support DSC, you'd still want to use that adapter?
Dude, the whole paragraph is about the adapter utilizing DSC to get full 4K/120Hz. That's the entire point of the product existing!
> How deep is your desk?
CX C9 review comparison Rtings
I have an L-shaped desk; it's 27 inches deep. I can put it on my second desk and have the monitors on the main desk. Just roll back about 3 feet and it will be golden. With a 50-inch TV you need to be within 5 to 7 feet of it.
TV Size to Distance Calculator and Science
Our TV Sizes to Distance Calculator helps you choose the right size TV for your space. The optimal viewing distance is about 1.6 times the diagonal length of the television. For example, for a 55" TV, the best distance is 7 feet.
www.rtings.com
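The rtings rule of thumb quoted in that snippet (viewing distance of about 1.6 times the diagonal) is trivial to apply to the sizes discussed in this thread. A quick sketch; the 1.6 factor comes from the snippet above, the size list is just for illustration.

```python
# Viewing-distance rule of thumb: distance ~= 1.6 x screen diagonal.

def viewing_distance_ft(diagonal_in: float, factor: float = 1.6) -> float:
    """Suggested viewing distance in feet for a diagonal in inches."""
    return diagonal_in * factor / 12   # inches -> feet

for size in (48, 55, 65):
    print(f'{size}": ~{viewing_distance_ft(size):.1f} ft')
```

For the 48" CX this works out to roughly 6.4 feet, which lines up with the "within 5 to 7 feet" figure mentioned above, and a 55" set lands at about 7.3 feet, matching the snippet's "7 feet" example.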
*hangs bigger tv*
*moves couch back a bit more*
Ah yeah, the 5-yearly cycle
ah so another case of