
PC VSync and Input Lag Discussion: Magic or Placebo?

So do the PS4 and Xbox One have the same kind of input lag as PC does?

Example: Battlefield 4 and Black Ops 3 on PS4; the controls feel very fluid and reactive.

On PC: vsync is fine with a controller, but you can absolutely feel the difference with vsync off. Yes, you get tearing, but the tearing leads to more kills. However, the PS4 versions still feel as responsive as the non-vsynced PC versions. Why? Is this just placebo?
 
Right, so setting pre-rendered frames to 1 in Nvidia's control panel solves a LOT of the lag issue. There is still some lag, but way less than at the default of 3 pre-rendered frames. Very playable in shooters like Battlefield 4 and even the new DOOM online. I won't play CS:GO this way, though. The lag feels comparable to PS4 lag, almost non-existent.

Just letting people know so they can try it for themselves.

I was also impressed by the 59Hz + vsync solution; it removed a lot of lag, but introduced nasty stutters now and then (no tearing, though). My solution keeps the screen buttery smooth.

One thing that won't go away, however, is the sudden increase in lag when there is a hard framerate drop (like 130 to 80 for a few seconds), even when you are otherwise playing at 60fps the whole time.

Note: when playing on a controller this stuff doesn't matter that much. I've been playing the Black Ops 3 beta on PC with vsync on with a controller and could still get kills. Auto aim is your friend.
 
Bump.

Still trying to find an explanation as to why consoles feel like they have less input lag than vsync on PC.


Also: I put prerendered frames back to 3 for DOOM. Seems to have less lag that way. Weird.
On BF4 the setting has a HUGE impact on input lag. Even with Vsync off.
 
Since this kind of thing varies on a per-game and per-engine basis, what are you all doing to get minimal input lag without tearing in Overwatch?

For me the in-game vsync adds tons of input lag, so I keep that off. If I limit the frames to my refresh rate (60) via my Crimson driver settings, there is less input lag than I get with vsync but more than there is when running an uncapped framerate. There's also some noticeable tearing. When running at a mostly stable 100fps there is hardly any input lag at all thanks to the higher input sampling from the game, and tearing is slightly less noticeable because any two frames hitting the screen out of sync are similar enough in content to not show obvious tearing as often.

But there's no solution I've come across that gives me a smooth image with no tearing or input latency, and that's a real shame because the game's art style deserves to be displayed smoothly.
 
Bump.

Still trying to find an explanation as to why consoles feel like they have less input lag than vsync on PC.


Also: I put prerendered frames back to 3 for DOOM. Seems to have less lag that way. Weird.
On BF4 the setting has a HUGE impact on input lag. Even with Vsync off.
Console games all have varying levels of input lag, depending on how the game was designed. See: http://www.displaylag.com/video-game-input-lag-database/
Prerendered frames set to 1 should reduce input lag for most games, though exceptions may exist of course. Fortunately, Nvidia Control Panel allows you to make exceptions pretty easily.
Out of curiosity, what program did you use for framelimiting?
Since this kind of thing varies on a per-game and per-engine basis, what are you all doing to get minimal input lag without tearing in Overwatch?
From my experience, driver-based framerate limiting tends to not be very good.
I don't have Overwatch, but have you tried ingame VSync + RivaTuner's 60FPS cap?
 
Consoles at 30 fps are way more responsive than pc games at 30 fps+half refresh rate vsync or 30 fps+30hz vsync and they get even better if you plug the console into a low latency monitor. I don't know why though.
Having RTSS cap at your refresh rate + 1 pre rendered frame with vsync makes games more responsive than just normal vsync. Doing all that still doesn't make pc games at 30 fps as responsive as console games though, but they do get just as smooth when using nvidia half refresh rate. 30 fps pc games were just a lost battle for me, nothing would get input lag as low as console. 60 fps works better though.

60 rtss cap + vsync + 1 pre rendered frame is pretty responsive, more than console 60 fps games, but falls apart if you can't maintain the fps (can't stand judder). Using borderless windowed + fps cap without vsync is sometimes more responsive, but you get occasional or constant stutters that way. Don't bother with vsync + 59 or 29 fps cap and stuff, it just leads to nasty judders.
I got tired of the juggle and just got gsync. Lowest input lag, perfectly displayed frames at any fps, it's just amazing.
 
Consoles at 30 fps are way more responsive than pc games at 30 fps+half refresh rate vsync or 30 fps+30hz vsync and they get even better if you plug the console into a low latency monitor. I don't know why though.
Having RTSS cap at your refresh rate + 1 pre rendered frame with vsync makes games more responsive than just normal vsync. Doing all that still doesn't make pc games at 30 fps as responsive as console games though, but they do get just as smooth when using nvidia half refresh rate. 30 fps pc games were just a lost battle for me, nothing would get input lag as low as console. 60 fps works better though.

60 rtss cap + vsync + 1 pre rendered frame is pretty responsive, more than console 60 fps games, but falls apart if you can't maintain the fps (can't stand judder). Using borderless windowed + fps cap without vsync is sometimes more responsive, but you get occasional or constant stutters that way. Don't bother with vsync + 59 or 29 fps cap and stuff, it just leads to nasty judders.
I got tired of the juggle and just got gsync. Lowest input lag, perfectly displayed frames at any fps, it's just amazing.

Wouldn't a cap of 60 just engage normal Vsync with the same amount of input lag? I also use rtss to cap but I haven't felt any differences between capped 60 and Vsync.

Now, I've been playing around with Battlefield 4 today, and pre-rendered frames at 1 plus borderless vsync leads to response that feels like vsync off in fullscreen mode. Of course turning vsync off in this setup is even more responsive, but it is good enough to have a smooth experience and still be competitive.

Overwatch ran higher than 130fps in the open beta here. I limited the frame rate to 120, since that gives the smoothest feeling with non-intrusive tearing. Great if you get those frame rates; sucks when you drop to 90, because you will feel the stutter.

I really hope GSync will get cheaper. 500 euros is a bit much for a TN panel without HDMI input.

Or I hope Vulkan will provide the same input lag as consoles, since shooters on console feel quite responsive. About that 30fps lock: I think console engines poll the input at 60Hz even when delivering just 30 frames. That could be why the input lag feels lower. Could be wrong though.

The end result, however, is that even with a powerful PC you will get a better experience on console if you don't like tearing and input lag.
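That 60Hz-polling idea can be sketched as a toy loop (entirely my assumption about how such an engine might work, not any real console engine): input is sampled every tick while a frame is only rendered every other tick, so the newest input sample is at most half a frame old.

```python
# Toy sketch: poll input at 60Hz while rendering at 30fps.
# Input ends up sampled twice per rendered frame.

INPUT_HZ = 60
RENDER_HZ = 30

def run(seconds):
    input_polls = 0
    frames = 0
    for tick in range(seconds * INPUT_HZ):
        input_polls += 1                         # sample the pad every tick
        if tick % (INPUT_HZ // RENDER_HZ) == 0:
            frames += 1                          # render every second tick
    return input_polls, frames

polls, frames = run(1)
print(polls, frames)  # 60 30
```

If this is roughly what console engines do, it would explain why a 30fps console game can feel more responsive than a 30fps PC game whose whole loop runs at 30Hz.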
 
From my experience, driver-based framerate limiting tends to not be very good.
I don't have Overwatch, but have you tried ingame VSync + RivaTuner's 60FPS cap?
Hey, just wanted to thank you for the suggestion. 60fps cap in Rivatuner Statistics Server + in-game vsync with triple buffering seems to offer the best of all worlds. Just as smooth and free of tearing as it was in borderless windowed, but with slightly less input delay. It's also easier on my GPU, which is always nice.
 
Now what I want to know is why DOOM 2016 is not capable of Triple Buffering at all, be it fullscreen or windowed.

I mean, it's a pretty basic feature that their groundbreaking engine can't even do.
 
Pretty easy to notice with a game like CS:GO

Vsync and Gsync are a big no-no in that game. You'll be getting headshotted before you even react to move your crosshair.
There's input lag even with G-sync in CSGO? I thought G-sync was the solution to that problem. Then again, both the Source engine and CSGO in particular have always been extra sensitive when it comes to input delay.

While I can limit my frames to 60fps and turn on triple-buffered vsync in Overwatch and enjoy a smooth video output with minimal input delay, I'm not able to do the same in CSGO without noticeable latency.
 
Hey, just wanted to thank you for the suggestion. 60fps cap in Rivatuner Statistics Server + in-game vsync with triple buffering seems to offer the best of all worlds. Just as smooth and free of tearing as it was in borderless windowed, but with slightly less input delay. It's also easier on my GPU, which is always nice.
Now what I want to know is why DOOM 2016 is not capable of Triple Buffering at all, be it fullscreen or windowed.

I mean, it's a pretty basic feature that their groundbreaking engine can't even do.
Chances are the Triple Buffering of both games is the "common" triple buffering and not "proper" triple-buffering. "Common" triple buffering adds an extra buffer to the queue, which helps with framerate drops but theoretically adds input lag. If you're maintaining a capped 60FPS, I'd leave Triple Buffering turned off imo.
"Proper" triple buffering allows for uncapped framerates without screen tearing and with minimal input lag. Pretty much no PC game implements this in-engine, though it can be forced with borderless windowed at the cost of a little input lag. Nvidia's upcoming FastSync will enable proper triple buffering in exclusive fullscreen, and the input lag might be low enough to be usable for CS:GO.

I'll be updating the OP soon to remove some outdated info and add some stuff on GSync, FreeSync, and FastSync.
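The "common" vs "proper" distinction can be illustrated with a deliberately simplified toy model (my own sketch; it ignores the backpressure a real driver applies when the queue fills): the GPU finishes several frames per display refresh, and we track which frame index actually hits the screen.

```python
def frames_shown(rendered_per_refresh, refreshes, mode):
    """Toy model of which frame index the display shows each refresh.

    'common' queues every finished frame and shows them in FIFO order,
    so the displayed frame falls behind the newest one (added lag).
    'proper' drops stale frames and always flips to the newest one.
    """
    shown, fifo, frame = [], [], 0
    for _ in range(refreshes):
        for _ in range(rendered_per_refresh):  # GPU work this interval
            fifo.append(frame)
            frame += 1
        if mode == "common":
            shown.append(fifo.pop(0))          # oldest queued frame
        else:
            shown.append(fifo[-1])             # newest frame wins
            fifo.clear()
    return shown

# GPU finishing 3 frames per 60Hz refresh:
print(frames_shown(3, 4, "common"))  # [0, 1, 2, 3] - ever staler
print(frames_shown(3, 4, "proper"))  # [2, 5, 8, 11] - always fresh
```

The FIFO variant is why queued "triple buffering" smooths out drops but feels laggier, while the latest-frame-wins variant is what uncapped-framerate-without-tearing schemes aim for.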
 
Now what I want to know is why DOOM 2016 is not capable of Triple Buffering at all, be it fullscreen or windowed.

I mean, it's a pretty basic feature that their groundbreaking engine can't even do.

It seems like most developers don't consider it a basic feature, for whatever reason. I've never seen anyone in the media quiz a developer on it. Probably because developers today are not allowed to give interviews and everything is filtered through the PR dept. It's not something that PR are going to have an answer for (technical subject, minuscule visibility).

I'm sure the engine allows them to implement it; for whatever reason, it's not a feature of their OpenGL renderer. Perhaps we'll get some better options in their Vulkan renderer.

If you have a Nvidia GPU you can force it through the driver, although id software recommend you don't for performance reasons. Anecdotally the AMD feature doesn't seem to work on Doom.

Nvidia's fast sync is just driver-level triple buffering for DirectX applications, so that wouldn't help with Doom either.

The fact is that fixed refresh displays are rubbish for complex computer games. Those that max out at 60Hz are even worse and give you very few options. You have no nice solution, it's either tearing, or latency and stutter.

Our best bet for a nice experience across many games is a hardware-based sync solution like VESA DisplayPort Adaptive Sync or Nvidia G-Sync. It just makes no sense to take away control from the GPU and force the render rate to match some arbitrary limit imposed by really old electronics technology inside your display.
 
There's input lag even with G-sync in CSGO? I thought G-sync was the solution to that problem. Then again, both the Source engine and CSGO in particular have always been extra sensitive when it comes to input delay.

While I can limit my frames to 60fps and turn on triple-buffered vsync in Overwatch and enjoy a smooth video output with minimal input delay, I'm not able to do the same in CSGO without noticeable latency.

Yes, GSync adds input lag to any game. It's not that noticeable outside of a game like CS:GO, but it's definitely there.

Overwatch wasn't made to be a competitive shooter and the servers are only 20 tick anyway (compared to CS:GO's 64 tick shitty MM and 128 tick in premium services/pro matches) so basically it's gonna be laggy no matter what you do with your settings. At only 20 refreshes per second, yea I bet triple buffered vsync is the way to go.
 
Yes, GSync adds input lag to any game. It's not that noticeable outside of a game like CS:GO, but it's definitely there.

G-sync doesn't add any input lag outside of the waiting time for a frame to fully finish.

The only time I've seen people think gsync adds input lag is due to the activation of vsync when they didn't cap their FPS properly.

Otherwise we're talking about a millisecond or 2 difference for capped 140 FPS g-sync vs 300 FPS 144hz no sync only because you can potentially see part of an unfinished frame sooner.

I still prefer uncapped FPS with ULMB for FPS but only because of blur reduction. The input lag difference is negligible.

Triple buffered v-sync is terrible for FPS, though.
 
Triple buffered VSync usually introduces unacceptable levels of input lag. Some double buffered engines actually have very low input lag. One of the best examples I can think of is Sanctum 2...it actually felt great even with Vsync on. Source engine games used to be great when double buffered (e.g. portal 2). For some reason though, Portal 2 is still triple buffered even when the double buffered option is selected...maybe it's a driver issue.

The idea that double buffered vsync is common has never been true for me. Double buffered vsync implementations are rare.

After getting a GSync monitor, the biggest improvement has been the ability to play with vsync with no input lag. Some games like Super Hexagon or Street Fighter 4 were made so much more playable with GSync.

GSync can add very small amounts of input lag if the frame rate is exceedingly high. NVidia's new fast sync may be an interesting solution, and might actually render GSync somewhat pointless for high FPS gaming.
 
Pretty easy to notice with a game like CS:GO

Vsync and Gsync are a big no-no in that game. You'll be getting headshotted before you even react to move your crosshair.

For me, the main reason to avoid using GSync in CS:GO is the lack of ULMB, rather than the nearly unnoticeable amount of input lag. Hopefully fast sync allows us the freedom to have it all: low input lag, ULMB, and no tearing. That's the dream.

My GTX 680 can output something around 200-300fps in CS GO at best, so the input lag from GSync would be around 2-4ms. I'd classify that as fairly negligible.
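That 2-4ms figure is just frame-time arithmetic: at N fps a new frame is finished every 1000/N milliseconds, which bounds how stale the frame you are shown can be.

```python
def frame_time_ms(fps):
    """Interval between finished frames at a given framerate."""
    return 1000.0 / fps

print(round(frame_time_ms(300), 1))  # 3.3  - top of the 200-300fps range
print(round(frame_time_ms(200), 1))  # 5.0  - bottom of that range
print(round(frame_time_ms(60), 1))   # 16.7 - a whole 60Hz refresh
```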
 
The idea that double buffered vsync is common has never been true for me. Double buffered vsync implementations are rare.
How did you learn that? I know that some games do triple-buffered VSync regardless (e.g. Sleeping Dogs), but I can't tell the difference across different games, and RadeonPro doesn't detect DirectX's Triple Buffering in most of the games I've played.
 
By having 1.66% less load? A 60fps limit already wouldn't have a GPU working at 100%, if the GPU was capable of going over it.
IIRC the quote is referring to the reduced load of capping the framerate compared to having an uncapped framerate, rather than the negligible load difference between 60FPS and 59FPS.
I'll be rephrasing that part of the OP in my update soon.
 
The only time you'll run into any amount of noticeable input lag with G-Sync is when you're hitting the refresh rate limit. In a game like CSGO, you can get around this by using fps_max to ensure you're not constantly hitting up against that refresh rate limit (e.g. fps_max 120 - 140).

It's what I do, I have my fps_max at 138 and leave G-Sync on. That way I get basically no extra input lag over v-sync off and I get the nice, smooth, complete frames of G-Sync.

It's really funny, I forgot about how much G-Sync adds to the overall smoothness of an image even when you're at framerates where tearing isn't really noticeable. Everything just feels even, properly paced and smooth. It's immediately noticeable to me now if I turn G-Sync off for CSGO and I don't like playing without it.

http://www.blurbusters.com/gsync/preview2/

Compare, for example: vsync off to G-Sync fps_max 120. Basically no difference in input lag.
 
Consoles at 30 fps are way more responsive than pc games at 30 fps+half refresh rate vsync or 30 fps+30hz vsync and they get even better if you plug the console into a low latency monitor. I don't know why though.
Having RTSS cap at your refresh rate + 1 pre rendered frame with vsync makes games more responsive than just normal vsync. Doing all that still doesn't make pc games at 30 fps as responsive as console games though, but they do get just as smooth when using nvidia half refresh rate. 30 fps pc games were just a lost battle for me, nothing would get input lag as low as console. 60 fps works better though.

60 rtss cap + vsync + 1 pre rendered frame is pretty responsive, more than console 60 fps games, but falls apart if you can't maintain the fps (can't stand judder). Using borderless windowed + fps cap without vsync is sometimes more responsive, but you get occasional or constant stutters that way. Don't bother with vsync + 59 or 29 fps cap and stuff, it just leads to nasty judders.
I got tired of the juggle and just got gsync. Lowest input lag, perfectly displayed frames at any fps, it's just amazing.

This just isn't true. I've played tons of 30FPS games on consoles that have horrible input lag. Even 60FPS games that have way more input lag than they should. Like BattlEAfront, Earth Defense Force 4.1, Doom 4, Uncharted NDCollection. Doom in particular's input lag is so bad on PS4 it almost feels unplayable to me. And I thought the input lag in the PC version with just a mouse at 60FPS was bad.

With pre-rendered at 1 with 1/2 refresh rate, games feel just as if not more responsive than 30FPS console games. That's just my subjective experience however.
 
It might be just me, but here's something weird I noticed today. After a test of sorts in L4D2 coop yesterday with vsync off on my 60Hz monitor, with great results (responsive input, ridiculously high frame rates averaging 200+ FPS, no distracting tearing), I went ahead and played a deathmatch round in CS:GO, also with vsync turned off.

That setting seems to be That One Thing(tm) that takes me from "always die first" to "podium finish usually, or close enough". I no longer seem to get shot by people who somehow got there first despite my network lag of ~100ms; instead, I could actually hit my foes! It made the game feel a lot better since I could actually win.

So I tried a round of competitive CS:GO - it's actually my first time.

Who would have thought that I could actually land kills, get MVPs, and end up having the team win?

CS:GO seems to be one of these games that have a bad lag problem with vsync on...
 
Man this is weird.

When I was on AMD I used to go Borderless Windowed + RTSS frame cap for best possible frame pacing.

Now I'm on Nvidia and I found that Fullscreen + Vsync gives me better results.

How do you guys set up stuff in the Nvidia Control Panel for fullscreen gaming? Enable Vsync + Triple Buffering and then disable Vsync in game?

What about Borderless Fullscreen, does it make a difference if I have Vsync + Triple Buffering enabled in the Control Panel?
It's weird because I feel like Borderless Windowed should give me better frame pacing than it does right now.
 
So do PS4 and Xbox one have the same kind of input lag as PC has?
No, they generally have more than a (sanely set up) PC. This is confirmed in pretty much all the measurements I've seen.

Are you playing both with the same input device? Because with a mouse you can easily notice input lag which is not noticeable on a dual-analog controller (due to its relative lack of precision and speed).
 
Man this is weird.

When I was on AMD I used to go Borderless Windowed + RTSS frame cap for best possible frame pacing.

Now I'm on Nvidia and I found that Fullscreen + Vsync gives me better results.

How do you guys set up stuff in the Nvidia Control Panel for fullscreen gaming? Enable Vsync + Triple Buffering and then disable Vsync in game?

What about Borderless Fullscreen, does it make a difference if I have Vsync + Triple Buffering enabled in the Control Panel?
It's weird because I feel like Borderless Windowed should give me better frame pacing than it does right now.

The faster your card is the further ahead it will render. If the DX flip queue is long (game dependent) then with a faster card and vsync you'll have bigger lag. To avoid this you can change the pre-render limit option in NV's drivers to 1 or 2.

TB in the CPL is for OpenGL only, it has no effect on D3D games.

Exclusive fullscreen + driver vsync + pre-render limit set to 1 usually gives the best results on NV h/w. Borderless is game dependent -- sometimes it works fine, sometimes it's a lag and stutter fest.
 
The faster your card is the further ahead it will render. If the DX flip queue is long (game dependent) then with a faster card and vsync you'll have bigger lag. To avoid this you can change the pre-render limit option in NV's drivers to 1 or 2.

TB in the CPL is for OpenGL only, it has no effect on D3D games.

Exclusive fullscreen + driver vsync + pre-render limit set to 1 usually gives the best results on NV h/w. Borderless is game dependent -- sometimes it works fine, sometimes it's a lag and stutter fest.

Interesting, but doesn't Vsync on automatically mean pre-rendered frames = 0?

What effect does changing the pre-rendered frame limit have if I have Vsync turned on?
 
Interesting, but doesn't Vsync on automatically mean pre-rendered frames = 0?

What effect does changing the pre-rendered frame limit have if I have Vsync turned on?

On the contrary: with vsync on you force the flip queue to always be shown synced with the display refresh, in the order it was rendered by the video card. So until the whole pre-rendered queue has been shown to you in order, the system can't show the frame which registers your input, and thus you get a lag equal to the length of the pre-rendered frame queue - which is application dependent but is usually 2 (meaning that you have 1 frame shown, 2 frames ready to be shown in the queue and 1 frame which is being worked on right now).

Without vsync the flip queue is being shown as fast as the next frame is ready to be shown so without vsync the pre-render limit means nothing basically. You can test it yourself but last time I checked this without vsync there was no observable lag difference with any number set in the pre-render limit option.

Pre-render queue acts a bit like triple buffering in the sense that you have a couple of frames ready beyond the one which is being shown right now which allows the system to present these frames while the GPU is working on the next one. It smooths out the framerate when vsync is on and gives less chance of frame drops but at the cost of additional input lag. If your system is fast enough to provide a steady stream of 16.6ms frames in a game then setting pre-render limit to 1 and vsync on is usually a better option. If it's on the borderline with some frames dipping above 16.6ms then it may be better to use the default settings or set the pre-render limit to 2-3 manually.

Although I personally just use adaptive vsync with the pre-render limit set to 1 in such cases, as I very much prefer lower input lag to a lack of tearing.
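The flip-queue lag described above can be put into rough numbers (a back-of-envelope sketch of the post's model, not a measurement): each queued pre-rendered frame, plus the one in flight, has to be scanned out before the frame carrying your input appears.

```python
def vsync_queue_lag_ms(pre_render_limit, refresh_hz=60):
    """Rough added display lag with vsync on, per the flip-queue model:
    (frames queued ahead + the one being rendered) * refresh interval."""
    return (pre_render_limit + 1) * 1000.0 / refresh_hz

print(round(vsync_queue_lag_ms(3), 1))  # 66.7ms-ish with a queue of 3
print(round(vsync_queue_lag_ms(1), 1))  # 33.3ms-ish with the limit at 1
```

Which matches the experience reported earlier in the thread: dropping the pre-render limit from 3 to 1 roughly halves the felt lag with vsync on.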
 
On the contrary: with vsync on you force the flip queue to always be shown synced with the display refresh, in the order it was rendered by the video card. So until the whole pre-rendered queue has been shown to you in order, the system can't show the frame which registers your input, and thus you get a lag equal to the length of the pre-rendered frame queue - which is application dependent but is usually 2 (meaning that you have 1 frame shown, 2 frames ready to be shown in the queue and 1 frame which is being worked on right now).

Without vsync the flip queue is being shown as fast as the next frame is ready to be shown so without vsync the pre-render limit means nothing basically. You can test it yourself but last time I checked this without vsync there was no observable lag difference with any number set in the pre-render limit option.

Pre-render queue acts a bit like triple buffering in the sense that you have a couple of frames ready beyond the one which is being shown right now which allows the system to present these frames while the GPU is working on the next one. It smooths out the framerate when vsync is on and gives less chance of frame drops but at the cost of additional input lag. If your system is fast enough to provide a steady stream of 16.6ms frames in a game then setting pre-render limit to 1 and vsync on is usually a better option. If it's on the borderline with some frames dipping above 16.6ms then it may be better to use the default settings or set the pre-render limit to 2-3 manually.

Although I personally just use adaptive vsync with pre-render limit set to 1 in such cases as I very much prefer lower input lag to lack of tearing.

Hey man, thx for all the input. So right now my 980 Ti is pretty much tearing through stuff, so I'll try driver Vsync + pre-rendered frames = 1.

My CPU might actually become somewhat of a bottleneck in the future, does this have any effect on which settings to choose?
 
I feel like Nvidia's triple buffering and pre-rendered frames settings have only started working properly recently.

For the hell of it, I turned GSYNC off on my monitor to see what the input lag was without it, and was surprised that the lag was very minimal and almost non existent in both Overwatch and CS 1.6. Granted, I'm playing at 144Hz, but in the past, it always felt like there were several frames of lag regardless of my driver settings. Now, it actually feels like there's 1 frame of lag at most.

Is FastSync actually working yet? That could explain part of it.
 
Hey man, thx for all the input. So right now my 980 Ti is pretty much tearing through stuff, so I'll try driver Vsync + pre-rendered frames = 1.

My CPU might actually become somewhat of a bottleneck in the future, does this have any effect on which settings to choose?

With CPU being the limiting factor there's actually less back pressure on the flip queue (since the GPU can't render the frames ahead if they are still being worked on by the CPU) and funnily enough you may get less input lag because of this even with the default pre-render limit setting. Give it a try, it's rather game to game dependent, some are ok with default settings while some show a clear input lag reduction when setting the pre-render limit to 1.
 
I use vsync in every game and never notice lag but I'm not a competitive online gamer. I do alright, but don't take it very seriously.

I also came from console gaming and have my PC connected to a TV so maybe I'm incapable of noticing regular input delay.
 
I tend to dislike tearing vastly more than input lag in games, so I always run with VSync on. I don't often play very twitch-based games that require precise input, and I have been using a controller more often than not, so it really doesn't bother me much anymore. If I was playing a lot of FPS with M/KB I might have a different stance.
 
Thought I'd quickly report back on this. What I tried:

Mirror's Edge Catalyst, Hyper settings, 1080p, fullscreen, in-game Vsync disabled, driver Vsync enabled, RTSS 60fps cap: basically, the lower the pre-rendered frames number, the better the frame pacing. It's never quite perfect, though that might be because my CPU hovers around and above 90ish% usage; GPU usage is around 75% at the most.

With pre-rendered frames = 1, frametimes are fluctuating between 16.66 and around 20ms, so I might try lowering settings to see what's causing the most stress on my CPU.

The Witcher 3, 4K, mostly Ultra settings except shadows and Foliage Distance (I figured those would stress my CPU too much), 30fps RTSS cap, rest as above: frametimes are staggeringly perfect, as in a solid flat line at 33.33ms. It's beautiful! With frame pacing like this I have zero issues with 30fps.
 