Nvidia Gsync is amazing!

daninthemix

Member
Yep, I'm definitely going g-sync when these stars align:

- single-card 4k performance becomes feasible (i.e. 60ish fps @ high settings @ 4k on modern titles on a single card)
- a 4k g-sync monitor becomes available that is high quality (i.e. 8-bit panel, not TN) and doesn't cost the earth.
 

JustinSane

Neo Member
So they don't build it into the cards, have a separate device and special monitors with support for it?

Some PC gamers really are played as fools eh?
 

riflen

Member
So they don't build it into the cards, have a separate device and special monitors with support for it?

Some PC gamers really are played as fools eh?

If you're completely ignorant of the technology and what's being achieved, it could seem that way I suppose.
 

Dreathlock

Member
I can live with that. It's still an improvement over 60Hz, and 75 fps looks better to me than 60 fps on my current 144Hz monitor. I'm assuming 75 fps on a 144Hz monitor looks the same as 75 fps on a 75Hz one.

Is this true? So when I am only gaming in the 50-90 range I won't get the full benefit of a 144Hz monitor?
 

th4tguy

Member
Gsync monitors are still way overpriced


Wow glad I checked before posting. Gsync tried to auto correct to "gay community" lol.
 

Kinthalis

Banned
Most of the games he mentioned perform poorly with vsync on. Unity and Batman in DX11 especially knock off a lot of fps when vsync is enabled. Vsync is not free in terms of performance.

With Gsync, in-game vsync is disabled, and he would see higher frame rates from that. Probably not double the frame rate (except in the case of Arkham City, which implements double-buffered vsync, yuck). Plus the smoothness factor of Gsync would certainly make the game feel more responsive.
 

EatChildren

Currently polling second in Australia's federal election (first in the Gold Coast), this feral may one day be your Bogan King.
Query: I've been told that g-sync monitors are really only worth it if you're floating somewhere around 50fps with frequent fluctuations, hence 144hz monitors, as the dynamic refresh rate goes well with stabilising each frame at those framerates. And that despite this, they're actually kinda shitty if you're hovering between 30 - 60fps (so let's say averaging 45fps), even though I would have thought they'd be just as advantageous, if not more so, for stabilising frame rendering at sub-60fps.

True/false/butts?
 

Damian.

Banned
Nvidia is hiring 5 year olds to do tech reports now? :p

We know what Gsync does, but your rig must have been messed up pretty bad beforehand if Gsync is doing what you say it is doing. Congrats on fixing your issues along with adding Gsync.
 

b0bbyJ03

Member
I can't wait for this tech to come to TVs. I need this, but I'm not willing to game on anything other than my plasma. Love the quality too much.
 

Marcone

Neo Member
I don't understand why people are so down on TN panels. I have a Dell U2713HM (1440p IPS) and an Acer XB280HK (4K TN) and there is barely any difference between the two other than the resolution.
 

Osiris

I permanently banned my 6 year old daughter from using the PS4 for mistakenly sending grief reports as it's too hard to watch or talk to her
I don't understand why people are so down on TN panels. I have a Dell U2713HM (1440p IPS) and an Acer XB280HK (4K TN) and there is barely any difference between the two other than the resolution.


If you're seeing "barely any difference" between a TN panel and an IPS panel then either your IPS panel is broken / very poor or you need new glasses. The difference in IQ is huge between the two IMO.
 

knitoe

Member
I don't understand why people are so down on TN panels. I have a Dell U2713HM (1440p IPS) and an Acer XB280HK (4K TN) and there is barely any difference between the two other than the resolution.
I also have both monitors and there are noticeable differences, but high-quality TN panels these days are good enough that I can put the Dell away. For gaming, they're fine. If I need accurate colors for work, probably not. TN panels are nowhere near as bad as the earlier ones. I think people who are still down on them probably had bad experiences with the earlier ones and/or are still regurgitating old information.
 

herod

Member
yeah, there is just naturally a larger quality spread between good and bad TN panels because they're the cheaper technology generally.
 

tuxfool

Banned
I also have both monitors and there are noticeable differences, but high-quality TN panels these days are good enough that I can put the Dell away. For gaming, they're fine. If I need accurate colors for work, probably not. TN panels are nowhere near as bad as the earlier ones. I think people who are still down on them probably had bad experiences with the earlier ones and/or are still regurgitating old information.

I have two panels of the same size one is IPS and another TN. I can easily spot the difference between the two side by side. But on its own I have no real problems with the TN.
 

knitoe

Member
But for people spending tons of money on gaming rigs that enable them to max out visual fidelity, to then arrive at "fine" with the monitor itself seems like a terrific anticlimax.
Until the recent Gsync IPS, that was the choice you had to make if you wanted fast refresh rate + Gsync. For gaming, that's a totally fine compromise.
I have two panels of the same size one is IPS and another TN. I can easily spot the difference between the two side by side. But on its own I have no real problems with the TN.
So, you agree with me.
 

hlhbk

Member
Nvidia is hiring 5 year olds to do tech reports now? :p

We know what Gsync does, but your rig must have been messed up pretty bad beforehand if Gsync is doing what you say it is doing. Congrats on fixing your issues along with adding Gsync.

I didn't change a thing on my rig at all. Just adding the monitor, setting it to 144Hz, and turning Gsync on gave me the performance increase I described in the OP.
 

tuxfool

Banned
Until the recent Gsync IPS, that was the choice you had to make if you wanted fast refresh rate + Gsync. For gaming, that's a totally fine compromise.

So, you agree with me.

I'm pointing out that the difference is significant, but not enough to make using a TN panel terrible.
 

Hazaro

relies on auto-aim
I didn't change a thing on my rig at all. Just adding the monitor, setting it to 144Hz, and turning Gsync on gave me the performance increase I described in the OP.
You realize that is impossible right?

Something else changed.
 

Serandur

Member
I'd really like to spring for Gsync, but I don't want to pay such a premium only to be locked in to Nvidia GPUs for the monitor's lifespan (I know; practically, not literally). I mostly stick to Nvidia anyway, but the concept just creates a mental barrier I find unpleasant to cross.

That XB270HU looks amazing.


Edit: The FPS improving thing though is... unlikely.
 

Exile20

Member
I didn't change a thing on my rig at all. Just adding the monitor, setting it to 144Hz, and turning Gsync on gave me the performance increase I described in the OP.

Doesn't make sense unless you are running at 720p or something. It does not increase frame rate.

Especially from 40 fps to 60 fps?
 

hlhbk

Member
Doesn't make sense unless you are running at 720p or something. It does not increase frame rate.

Especially from 40 fps to 60 fps?

Resolution hasn't changed: 1080p before, 1080p now. I tried it again with my old monitor, turning vsync back on, and saw the exact same performance decreases I described in the OP. I didn't change anything else.
 

Kinthalis

Banned
Doesn't make sense unless you are running at 720p or something. It does not increase frame rate.

Especially from 40 fps to 60 fps?

It does, if he's playing games that utilize double-buffered vsync, or that have a poor vsync implementation.

Double-buffered vsync means any time he dipped below 60 FPS he would drop to 30. Batman: Arkham City in DX11, for example. I had to turn on Adaptive Vsync for that game. With Gsync, suddenly frame rate drops below 60 aren't an issue.

Other games, like most Assassin's Creed games, have a high performance penalty with vsync on.

Try running the game with vsync on, then turn vsync off. I guarantee you'll see not only smoother frame times, but a higher frame rate (not a lot higher though). Of course you'll also get frame tearing up the ass, depending on your panel and frame rate.
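The 60-to-30 drop described here falls out of the arithmetic of double-buffered vsync: a finished frame has to wait for the next refresh tick, so the output rate is quantized to integer divisors of the refresh rate. A minimal sketch (Python; `double_buffered_vsync_fps` is a made-up illustrative helper, not any real API):

```python
import math

def double_buffered_vsync_fps(render_fps: float, refresh_hz: float = 60.0) -> float:
    """Effective on-screen framerate under double-buffered vsync.

    Each frame is held until the next vertical refresh, so a frame that
    takes even slightly longer than one refresh interval occupies two of
    them, quantizing the output to refresh_hz / n for integer n.
    """
    intervals = math.ceil(refresh_hz / render_fps)  # refresh ticks consumed per frame
    return refresh_hz / intervals

# A GPU rendering at 55 FPS misses every other 60 Hz refresh window:
print(double_buffered_vsync_fps(55))  # 30.0
print(double_buffered_vsync_fps(40))  # 30.0
print(double_buffered_vsync_fps(61))  # 60.0
```

Gsync sidesteps the quantization entirely: the panel refreshes whenever a frame is ready, so the displayed rate simply equals the rendered rate.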
 

Blitzhex

Member
Is this true? So when I am only gaming in the 50-90 range I won't get the full benefit of a 144Hz monitor?

Yeah, from my experience the full benefit of 120/144Hz is only realized with fps equal or close to the refresh rate. Try it out in an easy-to-run game.
 

Durante

Member
Query: I've been told that g-sync monitors are really only worth it if you're floating somewhere around 50fps with frequent fluctuations, hence 144hz monitors, as the dynamic refresh rate goes well with stabilising each frame at those framerates. And that despite this, they're actually kinda shitty if you're hovering between 30 - 60fps (so let's say averaging 45fps), even though I would have thought they'd be just as advantageous, if not more so, for stabilising frame rendering at sub-60fps.

True/false/butts?
Butts. G-sync is advantageous at all framerates, and something like 45 FPS average is actually one of the cases where the difference compared to fixed refresh is most pronounced.

I guess not everyone is aware of the big money making machine.
While you, of course, are aware of the secret techniques necessary to introduce a new way to drive monitors by only changing the GPU side of the communication.
 
G-Sync is indeed amazing. It doesn't actually improve performance but with the way everything feels and looks smoother at lower and more variable framerates it might make you think it is.
 

Dr. Kaos

Banned
Impossible results are impossible.

Have you tested Gsync vs Vsync disabled ?

I'm waiting for the next-gen AMD cards, which are going to be the first big jump in performance in YEARS. So I'll be stuck with FreeSync, which depends more on the monitor manufacturers than Gsync does, since Nvidia supplies the hardware spec for the Gsync module.

PS: when I say "big jump", I mean 2x performance or above. The 390X will apparently have triple the memory bandwidth that GDDR5 offers.
 
Yep, I'm definitely going g-sync when these stars align:

- single-card 4k performance becomes feasible (i.e. 60ish fps @ high settings @ 4k on modern titles on a single card)
- a 4k g-sync monitor becomes available that is high quality (i.e. 8-bit panel, not TN) and doesn't cost the earth.

100% with you on this.

i would have gone g-sync on my current 1440p set-up if there was an IPS variant that could be used reliably for photography work as well as gaming [like my current Dell model], but that's still asking too much.

soon my friends, soon.
 

Durante

Member
100% with you on this.

i would have gone g-sync on my current 1440p set-up if there was an IPS variant that could be used reliably for photography work as well as gaming
[like my current Dell model], but that's still asking too much.

soon my friends, soon.
I guess you bought before this was available?
 

Grief.exe

Member
Query: I've been told that g-sync monitors are really only worth it if you're floating somewhere around 50fps with frequent fluctuations, hence 144hz monitors, as the dynamic refresh rate goes well with stabilising each frame at those framerates. And that despite this, they're actually kinda shitty if you're hovering between 30 - 60fps (so let's say averaging 45fps), even though I would have thought they'd be just as advantageous, if not more so, for stabilising frame rendering at sub-60fps.

True/false/butts?

If I remember correctly, when you are below 60 FPS, the module inserts frames to compensate.

I would imagine being above 60 is the best value for the monitor as you get all the advantages along with no vsync. Gsync creates a situation where dipping below 60 is acceptable.
 

Unai

Member
If I remember correctly, when you are below 60 FPS, the module inserts frames to compensate.

I would imagine being above 60 is the best value for the monitor as you get all the advantages along with no vsync.

Nope. This happens when it's below 30 FPS with G-Sync. I believe FreeSync's floor is 40 FPS or so.

Edit: That is, if I understood what you said. Are you talking about, for instance, setting the monitor at 80Hz when running at 40 FPS? That's what I meant happens when it's below 30 FPS.
 

Kinthalis

Banned
If I remember correctly, when you are below 60 FPS, the module inserts frames to compensate.

I would imagine being above 60 is the best value for the monitor as you get all the advantages along with no vsync. Gsync creates a situation where dipping below 60 is acceptable.

No. If it goes below 30 the screen will refresh at 30 hz, just like a normal 30hz monitor would do (IIRC). Anything 30 and above will refresh as soon as the frame is dished up.
 

HTupolev

Member
Having vsync off, using triple-buffered vsync, and using gsync all let the GPU spit out frames as fast as possible, and there shouldn't be a framerate difference between them (with the exception of oddities like tearing framerates going above the refresh rate, or weird behaviors with adaptive refresh at ultra-low framerates).

Gsync should offer better framerates than double-buffered vsync, though. (That's basically the advantage that it has over double-buffered; a rock-solid double-buffered vsync should feel basically the same as gsync at the same framerate, but gsync allows for intermediate framerates.)
 

Kinthalis

Banned
When you are below 30 the monitor interpolates frames, and when you are greater than 30 it just adjusts the refresh rate?

Yes, or more precisely, it only refreshes the display when the GPU tells it it's got a new frame ready.

That's the whole point of Gsync. Normally the display refreshes 60 times per second, period. With Gsync, the panel does the logical, common-sense thing god intended: it WAITS for the GPU to give it a new frame, and THEN refreshes. The GPU drives the monitor, rather than the GPU having to try and sync with the monitor.
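The "GPU drives the monitor" idea can be sketched as a toy timing model (Python; both function names are invented for illustration): a fixed-refresh panel rounds every frame-completion time up to the next tick, while a variable-refresh panel scans out the moment the frame lands.

```python
import math

def fixed_refresh_display_times(frame_ready_s, refresh_hz=60.0):
    """Fixed refresh: a finished frame waits for the next scheduled tick."""
    interval = 1.0 / refresh_hz
    return [math.ceil(t / interval) * interval for t in frame_ready_s]

def vrr_display_times(frame_ready_s):
    """Variable refresh (Gsync-style): the panel refreshes when told to."""
    return list(frame_ready_s)

# A frame finished at t = 21 ms waits for the 33.3 ms tick on a 60 Hz
# panel, but is shown immediately on a variable-refresh one.
print(fixed_refresh_display_times([0.021]))  # ≈ [0.0333]
print(vrr_display_times([0.021]))            # [0.021]
```

The 12 ms the frame spends parked in the back buffer is exactly the judder and latency that Gsync removes.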
 

Durante

Member
No. If it goes below 30 the screen will refresh at 30 hz, just like a normal 30hz monitor would do (IIRC). Anything 30 and above will refresh as soon as the frame is dished up.
That's how I thought it worked too, but recently it was shown that below 30 it actually multiplies the current framerate to get a >30 refresh. E.g. if you are at 28 FPS it will refresh at 56 Hz. (This is superior behavior.)
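The low-framerate behavior Durante describes amounts to picking the smallest integer multiple of the frame rate that lands back inside the panel's refresh window. A rough sketch (Python; `vrr_refresh_hz` and the 30-144 Hz window are illustrative assumptions, not Nvidia's actual firmware logic):

```python
def vrr_refresh_hz(frame_rate: float, min_hz: float = 30.0, max_hz: float = 144.0) -> float:
    """Refresh rate a Gsync-style panel would pick for a given frame rate.

    Inside the panel's VRR window the refresh simply tracks the frame
    rate. Below the window, the module scans each frame out an integer
    number of times so the panel stays inside its window.
    """
    if frame_rate >= min_hz:
        return min(frame_rate, max_hz)
    multiplier = 2
    while frame_rate * multiplier < min_hz:
        multiplier += 1
    return frame_rate * multiplier

print(vrr_refresh_hz(28.0))  # 56.0 -- each frame scanned out twice
print(vrr_refresh_hz(12.0))  # 36.0 -- each frame scanned out three times
```

Because each refresh still shows a frame the GPU just produced (or an exact repeat of it), timing stays aligned to frame delivery, which is why this beats snapping to a fixed 30 Hz.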
 

Kinthalis

Banned
That's how I thought it worked too, but recently it was shown that below 30 it actually multiplies the current framerate to get a >30 refresh. Eg. if you are at 28 FPS it will refresh at 56 Hz. (this is superior behavior)

Cool... would this introduce latency though? Though at 28 FPS maybe you wouldn't notice...
 
I guess you bought before this was available?

yeah, bought March of 2014.

this monitor looks great though! this gives me much hope for my next purchase!

edit: okay wow i'm reading through this review and this thing sounds amazing - i might have to buy it sooner, rather than later!

edit 2: thanks Durante!
 

Unai

Member
That's how I thought it worked too, but recently it was shown that below 30 it actually multiplies the current framerate to get a >30 refresh. Eg. if you are at 28 FPS it will refresh at 56 Hz. (this is superior behavior)

Now that you said that, I think they introduced this behavior with a driver update.

Cool... would this introduce latency though? Though at 28 FPS maybe you wouldn't notice...

Probably less latency than when the refresh rate was fixed at 30Hz while running at 28 FPS.
 

Arulan

Member
Cool... would this introduce latency though? Though at 28 FPS maybe you wouldn't notice...

I don't see why it would. It's similar to running half-refresh Vsync (60Hz) on a 120Hz display, but minus the input latency associated with Vsync. If anything it might be less.
 
Query: I've been told that g-sync monitors are really only worth it if you're floating somewhere around 50fps with frequent fluctuations, hence 144hz monitors, as the dynamic refresh rate goes well with stabilising each frame at those framerates. And that despite this, they're actually kinda shitty if you're hovering between 30 - 60fps (so let's say averaging 45fps), even though I would have thought they'd be just as advantageous, if not more so, for stabilising frame rendering at sub-60fps.

True/false/butts?
Very low fps still feels bad with variable refresh, but now 50 fps doesn't feel much different from 60.
The higher the frame rate, the less visible tearing artifacts are, so the benefits of VRR diminish with increasing frame rate.
But VRR still helps to retain smoothness during random frame rate drops, like in CPU-limited situations or driver lags (e.g. shader recompilation).

But take that with a grain of salt, because while I'm familiar with a ROG swift, I don't own a VRR monitor myself.
That's how I thought it worked too, but recently it was shown that below 30 it actually multiplies the current framerate to get a >30 refresh. Eg. if you are at 28 FPS it will refresh at 56 Hz. (this is superior behavior)
FreeSync does this too. PcPer is wrong.
The difference between FreeSync and Gsync is what Gsync can do to prevent the display refresh colliding with GPU frame completion.

Some reports have suggested that when the frame-to-frame interval on a FreeSync display grows too long, the display responds by "locking" into a 40Hz refresh rate, essentially quantizing updates at multiples of 25 ms. Doing so would be pretty poor behavior, because quantization at 25 ms steps would mean horribly janky animation. You'd be making the worst of an already bad situation where the attached PC was running up against its own performance limitations. However, such talk is kind of nonsense on the face of it, since we're dealing with a variable-refresh display working in concert with a GPU that's producing frames at an irregular rate. What happens in such cases differs between FreeSync and G-Sync, but neither solution's behavior is terribly problematic.

Let's start with how G-Sync handles it. I talked with Nvidia's Tom Petersen about this question, since he's made some public comments on this matter that I wanted to understand.

Petersen explained that sorting out the timing of a variable-refresh scheme can be daunting when the wait for a new frame from the graphics card exceeds the display's maximum wait time. The obvious thing to do is to refresh the display again with a copy of the last frame. Trouble is, the very act of painting the screen takes some time, and it's quite possible the GPU will have a new frame ready while the refresh is taking place. If that happens, you have a collision, with two frames contending for the same resource.

Nvidia has built some logic into its G-Sync control module that attempts to avoid such collisions. This logic uses a moving average of the past couple of GPU frame times in order to estimate what the current GPU frame-to-frame interval is likely to be. If the estimated interval is expected to exceed the display's max refresh time, the G-Sync module will preemptively refresh the display part way through the wait, rather than letting the LCD reach the point where it must be refreshed immediately.

This preemptive refresh "recharges" the LCD panel and extends its ability to wait for the next GPU frame. If the next frame arrives in about the same time as the last one, then this "early" refresh should pay off by preventing a collision between a new frame and a gotta-have-it-now refresh.
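Petersen's collision-avoidance heuristic, as described in the quoted article, can be approximated in a few lines (Python; `should_preempt_refresh` and the plain moving average are my guesses at the shape of the logic, not the real G-Sync module's implementation):

```python
def should_preempt_refresh(recent_frame_times_ms, max_hold_ms):
    """Decide whether to refresh early with a repeat of the last frame.

    Estimate the next GPU frame interval from a moving average of recent
    frame times; if the panel would hit its maximum hold time before the
    next frame arrives, squeeze in a refresh now so the wait window is
    "recharged" and a collision with the incoming frame is avoided.
    """
    estimated_ms = sum(recent_frame_times_ms) / len(recent_frame_times_ms)
    return estimated_ms > max_hold_ms

# Recent frames took ~41 ms, but the panel can only hold a frame for
# 33.3 ms (a 30 Hz floor), so the module refreshes preemptively:
print(should_preempt_refresh([40.0, 42.0], 33.3))  # True
print(should_preempt_refresh([12.0, 14.0], 33.3))  # False
```

The bet being made is that frame times are locally stable: if the next frame takes about as long as the last couple did, the early repeat lands mid-wait and the panel is free again when the real frame shows up.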

More info: https://techreport.com/review/28073/benq-xl2730z-freesync-monitor-reviewed/3
 