
CES: Nvidia brings G-Sync support to FreeSync monitors

TacosNSalsa

Member

dirthead

Banned
Okay, so I tested this out with my MSI Optix G27C2 and it works pretty well IF I disconnect the HDMI cable I have going to my capture card. If I leave it connected it will cap at 60Hz even though I'm connected to my 144Hz monitor through DisplayPort. Messing around with stuff to see if I can find a remedy, but maybe that's just how it works.

It's sad how fucked up cabling is on PCs. It needs to be cleaned up so badly. I really, really hope that HDMI 2.1 adoption gets pushed through fast and we completely get rid of DisplayPort. I just want HDMI -> receiver -> display like a normal setup. I really don't know who these companies are kidding. HDMI won. Dedicated computer speakers were always stupid. All that crap needs to be standardized.
 

Alexios

Cores, shaders and BIOS oh my!
do you have to install geforce experience to get the freesync support?
No, just the driver/control panel.

I got the cheapest monitor from Nvidia's list and I'm pretty happy. It seems to work as intended (no flickering or tearing or whatever), but despite upgrading from 60Hz to 144Hz and G-Sync I wouldn't say it's the revolutionary experience people hyped it up to be. Then again, I haven't done a direct comparison with regular/triple-buffered V-Sync enabled to see how the frame drops show and whether that seems much choppier now. Still, I can definitely tell when the frame rate goes below 60fps, as before (that seems to be my threshold; anything above and I really can't tell for now). So the biggest upgrade for me turned out to be going from 1080p to 1440p, at a slightly larger size (but not too large either, I like to keep the PPI dense). I wouldn't call it a worthwhile upgrade if my last monitor hadn't been dying, but since I needed to buy a new one anyway, it worked out.
 

Silver Wattle

Gold Member
No, just the driver/control panel.

I got the cheapest monitor from Nvidia's list and I'm pretty happy. It seems to work as intended (no flickering or tearing or whatever), but despite upgrading from 60Hz to 144Hz and G-Sync I wouldn't say it's the revolutionary experience people hyped it up to be. Then again, I haven't done a direct comparison with regular/triple-buffered V-Sync enabled to see how the frame drops show and whether that seems much choppier now. Still, I can definitely tell when the frame rate goes below 60fps, as before (that seems to be my threshold; anything above and I really can't tell for now). So the biggest upgrade for me turned out to be going from 1080p to 1440p, at a slightly larger size (but not too large either, I like to keep the PPI dense). I wouldn't call it a worthwhile upgrade if my last monitor hadn't been dying, but since I needed to buy a new one anyway, it worked out.
Do you have v sync enabled?
 

Bolivar687

Banned
This thread made me remember that I own a G-Sync monitor, and I won't be returning to the gentle, prosumer embrace of mother AMD anytime soon... :messenger_loudly_crying:
 

SatansReverence

Hipster Princess
Ended up picking up an Asus VG278Q and couldn't be happier. 120-140fps in BFV and not a single torn frame.

When playing Cities: Skylines with frames swinging from 60 to 30 it still plays pretty smooth, especially above 40. The framerate swing is noticeable, but there's a lot less stutter, and no tearing is baller.
 

Alexios

Cores, shaders and BIOS oh my!
Do you have v sync enabled?
No, I know it has to be off for G-Sync. The monitor itself also correctly reports its vertical frequency as FreeSync rather than a fixed 144Hz or whatever. Like I said, I haven't gone back to compare G-Sync vs V-Sync to see if the downgrade is now more visible than the upgrade was when first setting it all up.
 
There's only one G-Sync, and that's G-Sync with a module on the display side. The rest is simply VESA's Adaptive-Sync or HDMI's VRR. It doesn't matter what label Nvidia attaches to it. Don't be fooled by clever marketing.

If Nvidia cards don't support DisplayHDR 1000 with adaptive sync on a monitor without a G-Sync module, then the limitation lies with Nvidia, not with the monitor or the connection standard.
Ok, I was going to say, I thought there was a physical reason that G-sync monitors were more expensive.
 
No, I know it has to be off for G-Sync. The monitor itself also correctly reports its vertical frequency as FreeSync rather than a fixed 144Hz or whatever. Like I said, I haven't gone back to compare G-Sync vs V-Sync to see if the downgrade is now more visible than the upgrade was when first setting it all up.

v sync should be enabled globally through the nvidia control panel

they really do a shitty job of coming up with a guide telling you how to set it all up
 

Alexios

Cores, shaders and BIOS oh my!
v sync should be enabled globally through the nvidia control panel

they really do a shitty job of coming up with a guide telling you how to set it all up
So with these settings, how does the monitor itself correctly report that it's running in FreeSync rather than just its set Hz, and why don't I get any tearing or anything else I should notice, if I've merely been running with V-Sync disabled and no G-Sync to speak of?
 
So with these settings, how does the monitor itself correctly report that it's running in FreeSync rather than just its set Hz, and why don't I get any tearing or anything else I should notice, if I've merely been running with V-Sync disabled and no G-Sync to speak of?

for freesync, im not sure, as i have a g sync monitor. but i would assume it is the same process?

enable g sync in the nvidia control panel
enable v sync globally in the nvidia control panel
cap your frame rate to your monitor's refresh rate - 1 using RTSS (i have a 120hz ultrawide, so i cap at 119)
disable v sync in whatever game you are trying to play

this article explains it a bit:
https://www.blurbusters.com/gsync/gsync101-input-lag-tests-and-settings/14/

they mention refresh rate minus 3, i've always heard refresh rate minus 1, which is what i've been using without issues
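The cap arithmetic in those steps is trivial but easy to sketch. As a rough illustration (the helper function below is hypothetical, not part of any Nvidia or RTSS tool): both the minus-3 offset from the Blur Busters article and the minus-1 offset the poster uses aim at the same thing, keeping the framerate just inside the monitor's VRR window so V-Sync never actually engages.

```python
# Sketch: pick an RTSS frame cap that keeps the framerate inside
# the G-Sync/VRR range. Offsets per the thread: Blur Busters
# suggests refresh - 3; the poster has used refresh - 1 without issue.

def gsync_fps_cap(refresh_hz: int, offset: int = 3) -> int:
    """Return a frame cap just under the monitor's max refresh rate."""
    return refresh_hz - offset

print(gsync_fps_cap(120, offset=1))  # 119, the cap used for the 120Hz ultrawide
print(gsync_fps_cap(144))            # 141, the Blur Busters suggestion at 144Hz
```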
 

Alexios

Cores, shaders and BIOS oh my!
Yeah, I was googling again and found that article. I guess G-Sync appears to work either way, but when you have V-Sync enabled and happen to exceed the frame rate range where G-Sync actually works, that's when V-Sync does its thing as usual. Or something, if I get this right.

I guess I haven't played games where I do exceed that gsync range yet to not notice anything wrong with it. I already used RTSS to limit the framerate to a bit less than my max refresh rate as I had read about that part before. I'll enable vsync now but I doubt I'll notice a difference given this.
 

Alexios

Cores, shaders and BIOS oh my!
I don't have that "Set up G-SYNC" option at all.

My monitor is a 240hz freesync one.

Is it because of the HDMI? The cable allows me to have 240hz with no problems.
Probably. I use a DisplayPort cable, but I was going to do that anyway, since I need my GPU's one HDMI output for my VR headset. And there's also a bug (?) where connecting another active device over HDMI (so only while VR is on) disables G-Sync on my primary DisplayPort monitor too.

I think that when I first used the HDMI cable I couldn't even select the monitor's maximum resolution/refresh rate, even outside any G-Sync shenanigans, so that probably means DisplayPort is indeed better for some things.
 
I don't have that "Set up G-SYNC" option at all.

My monitor is a 240hz freesync one.

Is it because of the HDMI? The cable allows me to have 240hz with no problems.

is your monitor on the list of supported devices? if not, you may have to enable the freesync override within the control panel

i THINK displayport has higher bandwidth than HDMI as well. i can only get 3440x1440 @ 50 hz over HDMI, while i can do 3440x1440 @ 120hz over DP. it could just be my screen using an older HDMI standard though (alienware aw3418dw)
 
I don't have that "Set up G-SYNC" option at all.

My monitor is a 240hz freesync one.

Is it because of the HDMI? The cable allows me to have 240hz with no problems.

I forgot about this whole thing, but can confirm: plugged in the DisplayPort cable and now the option is there, turned on by default. I got this as a cheap used 4K IPS 60Hz FreeSync monitor. Now it's worth a lot more, I suppose.
 

nkarafo

Member
i THINK displayport has higher bandwidth than HDMI as well. i can only get 3440x1440 @ 50 hz over HDMI, while i can do 3440x1440 @ 120hz over DP. it could just be my screen using an older HDMI standard though (alienware aw3418dw)
It depends on the HDMI version, maybe?

My monitor came with an HDMI cable and it displays 240Hz@1080p fine. I assume it's a more recent version of the HDMI standard or something. But it's probably still less capable than DisplayPort.
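A rough back-of-the-envelope check supports both observations above. Uncompressed video needs roughly width × height × refresh × bits-per-pixel of bandwidth (24 bpp for 8-bit RGB), and this sketch ignores blanking intervals and cable/encoding overhead, so the real requirement is somewhat higher. The effective data rates used below (about 14.4 Gbit/s for HDMI 2.0 and 25.92 Gbit/s for DP 1.4 HBR3, after 8b/10b encoding) are the commonly cited figures, not anything measured here.

```python
# Rough active-pixel bandwidth check. Real links add blanking and
# encoding overhead, so these numbers are optimistic lower bounds.

def video_gbps(width: int, height: int, refresh_hz: int, bpp: int = 24) -> float:
    """Uncompressed video data rate in Gbit/s for the given mode."""
    return width * height * refresh_hz * bpp / 1e9

# Approximate effective data rates after 8b/10b encoding:
HDMI_2_0 = 14.4   # Gbit/s
DP_1_4 = 25.92    # Gbit/s (HBR3)

# 3440x1440@120 is already ~14.27 Gbit/s before overhead - too tight
# for HDMI 2.0, and far beyond older HDMI versions, but easy for DP 1.4.
print(round(video_gbps(3440, 1440, 120), 2))  # 14.27

# 1080p@240 is ~11.94 Gbit/s - fits HDMI 2.0, matching the report
# of 240Hz@1080p working fine over the bundled HDMI cable.
print(round(video_gbps(1920, 1080, 240), 2))  # 11.94
```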
 

kraspkibble

Permabanned.
i just got a BenQ EW3270U. it's not validated by Nvidia but holy shit it works so well. i can't go back to my old monitor now.
 

kraspkibble

Permabanned.
You mean the EW3270U? What bit depth is that HDR? What GPU do you have?
whoops, yeah...forgot the 3 lol.

it says 10bit but apparently it's not bright enough for HDR10 even though that's what it's advertised as. i'm impressed with the HDR it can do. some games/videos look fantastic with it but some are just too saturated.

i have an RTX 2080 founders edition.
 