
PC VSync and Input Lag Discussion: Magic or Placebo?

shockdude

Member
Update 6/27/16: WIP. Added brief info, need citations.

What’s this thread about?

I’d like to hear about your experiences with input lag and VSync, what you know and don’t know, and ultimately make conclusions about various setups and their input latencies.
Traditionally, there has always been a tradeoff between image fluidity and input latency. You either get minimal input latency with image tearing & frame judder, or a perfectly fluid 60FPS image with noticeable input latency. I'm someone who prioritizes low input latency in games, but I don't want to experience screen tearing if I don't have to.
There are a lot of tricks floating around the internet to try to get both a perfect 60FPS with low input latency, from playing the game windowed to limiting the framerate. I like to think that I can perceive input lag, but sometimes it’s hard to tell if these tricks work or do anything – hence, “Magic or Placebo.”

Brief summary of VSync and input latency:
Thanks Arulan, Durante, and HTupolev for the additional input and clarifications.

A lot of input latency is based on hardware factors outside of your control, such as the quality of your keyboard and your monitor. Everything else is affected by software, and one major source of input latency is VSync.
Here's a brief overview of the various VSync options, assuming an ordinary 60Hz monitor.
  • No VSync: Rendered image is sent directly to the monitor at whatever framerate the GPU can muster. Typically has the least amount of input latency and is suitable for competitive gaming. The higher the framerate the lower the latency. Usually results in screen tearing and minor juddering due to the framerate not syncing with the 60Hz refresh rate.
  • VSync: More specifically known as double-buffered VSync. Rendered image goes through a buffer before going to the monitor. When implemented properly, eliminates screen tearing and syncs the framerate with the monitor refresh rate for perfect 60Hz smoothness. Usually causes input lag (1-3 frames, depending on the game). May not respond well to framerate dips (e.g. jittering/uneven frame timing, drop to 30FPS instead of to 40-55FPS).
  • Triple-buffered VSync (common): Implemented by games and external programs. Similar to VSync except with another buffer (more VRAM). Can result in a higher average framerate compared to double-buffered VSync. Input latency may increase or decrease compared to double-buffered VSync, depending on your game's performance. Many games do not have this option. Read the rest of the first page for more info.
  • Triple-buffered VSync (proper): Implemented by Windows' Desktop Window Manager (e.g. Vista/7/8), and is used when playing games in a window (borderless fullscreen). Low latency, but not as low as No VSync. Usually fixes tearing but doesn't fix jitter/uneven frame timing.
This article does a good job of explaining VSync and Triple-Buffering in more depth: http://www.anandtech.com/show/2794
Here's a post by HTupolev which also elaborates on the above: http://www.neogaf.com/forum/showthread.php?p=150371456#post150371456
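Throughout the thread, latency gets quoted in "frames"; as a quick point of reference, here's the trivial conversion to milliseconds at a given refresh rate (assuming one frame of lag equals one full refresh interval):

Code:
# Convert "frames of lag" into milliseconds at a given refresh rate.
def frames_to_ms(frames, refresh_hz=60):
    return frames * 1000.0 / refresh_hz

# e.g. the 1-3 frames commonly attributed to double-buffered VSync:
print(frames_to_ms(1), frames_to_ms(3))   # ~16.7ms to 50ms at 60Hz
print(frames_to_ms(1, 144))               # ~6.9ms per frame at 144Hz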

The following are some tricks I've found around the internet which attempt to combine both low input lag and a VSynced image, or at least a close-enough compromise.

Forcing triple-buffering (common) using external programs

Triple-buffering can reduce variation in input latency from framerate fluctuations compared to double-buffered VSync. It may or may not reduce input latency. Your mileage may vary.
For Nvidia and AMD cards, Triple-Buffering for OpenGL games can be forced using the GPU’s respective control panel.
For Direct3D games, forcing triple-buffering generally requires an external application. The only application I know that can do this is RadeonPro, which is pretty easy to use and works with both D3D9 and D3D11 games, 32-bit and 64-bit. Despite the name, RadeonPro will work with Nvidia and Intel GPUs in addition to AMD GPUs; my laptop has both an Nvidia and Intel GPU.
RadeonPro has the side-effect of causing some games to hang upon exit or stay running in the background, requiring them to be killed with the Task Manager. Also it doesn’t work for all games; I hear Watch Dogs doesn’t work well with it.
There's also D3DOverrider, which used to be awesome but is no longer supported and generally a hassle to set up on modern computers.

Windowed gaming/borderless fullscreen

One of the benefits of borderless windowed fullscreen, in addition to instantaneous alt-tabbing, is that it enforces Triple Buffering (proper) on Windows Vista/7/8 at least. However, it also causes reduced performance for some games, and is incompatible with some configurations like SLI. Also, it still experiences the same jitter/uneven frame timing issues as No-VSync, unless you have a good external frame limiter e.g. Dxtory, RivaTuner, etc.
Something I’d like to know but have been unable to conclusively test is the input latency difference between windowed and fullscreen for various VSync configurations. The only conclusion I have is that fullscreen No-VSync has less input lag than windowed No-VSync. Can someone more experienced with borderless fullscreen chime in on this?

Adaptive VSync

This enables VSync when the framerate is at or above the refresh rate (e.g. 60FPS at 60Hz), and disables it during FPS dips, as an alternative to experiencing a larger FPS drop with VSync still enabled.
Implementations of adaptive VSync vary. RadeonPro, MSI Afterburner, etc. offer it, and Nvidia exposes an Adaptive VSync option in its driver control panel. Not sure about AMD.
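For intuition, here's a rough sketch of the per-frame decision adaptive VSync makes (purely illustrative, not any vendor's actual implementation): wait for the vblank only when the frame was ready in time, otherwise present immediately and accept a single tear instead of stalling a whole refresh.

Code:
# Hypothetical per-frame logic for adaptive VSync on a 60Hz display.
REFRESH = 1.0 / 60.0  # seconds per refresh (~16.7ms)

def choose_present_time(frame_ready, next_vblank):
    """Return when to present a finished frame."""
    if frame_ready <= next_vblank:
        return next_vblank    # frame made it in time: synced flip, no tearing
    return frame_ready        # frame missed the vblank: flip now, tears once

# Example: a frame finishing 2ms late is shown immediately rather than
# being held back ~14.7ms until the following vblank.
print(choose_present_time(frame_ready=0.0187, next_vblank=0.0167))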

Reducing maximum number of pre-rendered frames

This can be adjusted in the Nvidia Control Panel, and possibly in AMD's equivalent. Fewer pre-rendered frames result in less input lag, but may cause more severe framerate drops if you can't keep 60FPS. If you can maintain 60FPS and you want less input lag, set the number of pre-rendered frames as low as possible.
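As a rough back-of-the-envelope (real pipelines vary by game and driver, so treat this as an upper-bound intuition), the latency added by the queue scales with frame time, which is why the penalty is so much worse at 30FPS than at 60FPS:

Code:
# Rough estimate of latency added by queued (pre-rendered) frames,
# assuming each queued frame delays input by one full frame time.
def queue_latency_ms(fps, queued_frames):
    return queued_frames * 1000.0 / fps

print(queue_latency_ms(60, 3))   # +50ms at 60FPS with a 3-frame queue
print(queue_latency_ms(30, 3))   # +100ms at 30FPS with the same queue
print(queue_latency_ms(60, 1))   # +16.7ms with the queue reduced to 1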

Nvidia FAST Sync

TODO

Nvidia decided to finally create an implementation of "proper" Triple Buffering in exclusive fullscreen mode, and they called it FAST Sync. Coming out soon. Low latency and no screen tearing, but if it behaves anything like borderless windowed fullscreen then there will be imperfect frame pacing and judder even with a frame limiter.

More info in the FAST Sync thread.

Limiting the framerate to equal the refresh rate (e.g. 60FPS @ 60Hz)

TODO

Basically it's magic. Enable VSync, then use RivaTuner to limit the framerate to equal your refresh rate.

Very low latency.
No tearing.
Perfect frametiming.

Possible framerate hitches if the game engine doesn't play nice with the framelimiter.
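For reference, an external limiter like RivaTuner essentially paces frame submission onto a fixed interval. Below is a bare-bones sleep-based sketch of that idea (not RivaTuner's actual mechanism, which hooks the present call and uses far more precise waits):

Code:
# Minimal sleep-based frame limiter sketch targeting 60FPS; illustrative only.
import time

TARGET_FPS = 60.0
FRAME_INTERVAL = 1.0 / TARGET_FPS

def run(render_one_frame, num_frames=300):
    next_deadline = time.perf_counter()
    for _ in range(num_frames):
        render_one_frame()
        next_deadline += FRAME_INTERVAL
        # Sleep off whatever is left of this frame's slot so frames are
        # submitted on a steady ~16.7ms cadence.
        remaining = next_deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)
        else:
            # We fell behind; reset the deadline instead of spiralling.
            next_deadline = time.perf_counter()

run(lambda: None, num_frames=60)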

Limiting the framerate to the refresh rate minus 1 (e.g. 59FPS @ 60Hz)

TODO. Similar to the above.

Guaranteed low latency.
No tearing.

Judder due to framerate not equaling the refresh rate.
Unsuitable for certain games (fighting games, shmups, etc.)
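To make the judder concrete (simple arithmetic, assuming a perfectly steady cap): the display refreshes 60 times per second but only receives 59 new frames, so roughly once a second one frame is held for two refreshes, which is the periodic hitch people notice.

Code:
# Duplicate-frame cadence when capping just below the refresh rate.
refresh_hz, cap_fps = 60, 59
repeats_per_second = refresh_hz - cap_fps          # 1 frame/second shown twice
print(repeats_per_second, 1 / repeats_per_second)  # one hitch every ~1.0s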

Nvidia GSync and AMD FreeSync

TODO

I don't have GSync, but I hear it's awesome. Set a framelimit to a couple frames below your framerate (e.g. 140FPS @ 144Hz) and never worry about latency again. FreeSync should be similar.

Other resources

TODO

Here's a decent article from AnandTech where they test various input lag situations with a high-speed camera (ty riflen).
 

Dr. Kaos

Banned
I share your curiosity in these matters, but I have no light to shed upon them.

Perhaps someone like Durante will pop in.
 
It depends on the game honestly. Most games I've used VSync in, I can't tell the difference in input lag. The one big exception is Counter Strike Global Offensive. Though, adaptive vsync (through the nvidia driver) solved that problem.

I am using Gsync now. No input lag or screen tearing issues at all for me. Though, I can still notice dramatic frame drops (120 to like 70 for example).
 

shockdude

Member
It depends on the game honestly. Most games I've used VSync in, I can't tell the difference in input lag. The one big exception is Counter Strike Global Offensive. Though, adaptive vsync (through the nvidia driver) solved that problem.

I am using Gsync now. No input lag or screen tearing issues at all for me. Though, I can still notice dramatic frame drops (120 to like 70 for example).

Input lag is one of those things where you actually need to put effort into noticing it, but once you notice it you can't stop noticing it. I find I'm much more tolerant of input lag when using a controller vs. when using mouse & keyboard.

GSync has no input lag? Nice.
Out of curiosity, would you be able to test any notable visual or latency differences of GSync 120Hz vs. No-VSync 120Hz, or even GSync 60Hz vs No-VSync 60Hz? If you can't notice any difference (besides tearing, obviously), that's ok.
 

GavinUK86

Member
Not sure I could go into any details but turning Vsync on in 99% of games produces input lag I can feel. It's either screen tearing or lag.
 
Not sure I could go into any details but turning Vsync on in 99% of games produces input lag I can feel. It's either screen tearing or lag.

Same here. I absolutely hate input lag so I've had to settle with screen tearing all these years. Can't wait for something like Gsync to be the standard.
 
I'm someone who has very little tolerance for input lag, but more often than not, I choose to play with vsync on. Vsync makes things look way smoother, 70 FPS on a 60hz monitor just seems so...choppy. It makes it look like LESS than 60 FPS.

So unless there's truly awful input lag, I have vsync on.
 
Input lag sucks but fuck screen tearing. I will deal with the worse lag as long as the screen doesn't tear. I can't play wolfenstein due to it tearing like shit.
 
It's actually really weird for me, because I can feel VSync almost as readily as I can see 60fps. Like, I play with inverted aim, and all the time I'll sit down at someone's PC to help them through a spot... and I notice the weird VSync feeling before I notice the aim is upside down.

Unfortunately, I still don't have good reflexes or actually play well at all. :X But I can feel VSync being on!
 
I don't think I ever noticed input lag unless it was obviously really bad (like half a second delay or something). Otherwise I seem to be oblivious to it, which I guess makes me lucky? Then again I never put any conscious effort to sense or measure it in any games, and I never play anything competitive and I'm generally not a high skilled gamer, so those might be reasons why I haven't noticed it yet.

I definitely notice and hate tearing though (so I always use Vsync), and I love 60fps and won't settle for less on PC.
 
Frame limiting should be used with vsync off and be 1 fps lower than your refresh rate.

- same input lag as no vsync
- it eliminates judder
- shouldn't screen tear
- doesn't force video cards to 100% all the time (sometimes quieter)

Some cons?

In some games, since you're limited in frame rate, it's hard to make certain jumps, etc. It depends on the engine, but Source is a good example with Counter Strike.

With my old crossfire setup this was the only way to play FarCry 3, and quite a few other games because of the hair pulling judder.

Dxtory or something, Afterburner, etc. offer frame limiting.
 
Screen tearing can occur regardless of whether you are over or under your refresh rate.

Triple buffering is not created equal. OpenGL basically does not increase input latency with TB but DirectX does increase latency.
 

shockdude

Member
Ty AngryNapkin for the info. Added to OP.
Some more questions: does 59FPS @ 60Hz introduce a "duplicate frame" every second? What happens when you limit to 60FPS instead of 59?

Not sure I could go into any details but turning Vsync on in 99% of games produces input lag I can feel. It's either screen tearing or lag.

Same here. I absolutely hate input lag so I've had to settle with screen tearing all these years. Can't wait for something like Gsync to be the standard.

I think you should try the RadeonPro "lock framerate to refresh rate" trick. It's fairly easy to set up and very easy to verify. Also I'm simply curious as to whether the trick actually works on other computers and not just mine.

Screen tearing can occur regardless of whether you are over or under your refresh rate.

Triple buffering is not created equal. OpenGL basically does not increase input latency with TB but DirectX does increase latency.
The screen tearing regardless of refresh rate is what I experience. I see screen tearing even when the game is limited to 60FPS @ 60Hz.
As for OpenGL triple-buffering having no input lag, can you confirm? Last I checked OpenGL triple-buffering did add a little input lag to Worms Revolution, though that could just be a fluke.
 
Screen tearing can occur regardless of whether you are over or under your refresh rate.

Triple buffering is not created equal. OpenGL basically does not increase input latency with TB but DirectX does increase latency.

In my experience it either got rid of it or it was still there depending on the game. It's only based on my experiences. Sometimes the only way to get rid of tearing is vsync and when I was competing in CAL I'd never use vsync.

I can't get into the technical reasons so it's just my thoughts and what worked for me.
 

HTupolev

Member
Here's a brief overview of the various VSync options, assuming an ordinary 60Hz monitor.
  • No VSync: Rendered image is sent directly to the monitor. The higher the framerate the lower the latency. Typically has the least amount of latency and is suitable for competitive gaming. Will likely exhibit screen tearing and minor juddering due to the framerate not syncing with the 60Hz refresh rate.
Yes, with vsync off, a frame is immediately made into the "current outputting" frame when it finishes rendering, even if another frame is currently being output. When this occurs, the top part of the displayed image is from an earlier frame, and the bottom part is from the recent frame, thus creating a screen tear.

  • VSync: Rendered image goes through some buffers before going to the monitor. When implemented properly, eliminates screen tearing and syncs the framerate with the monitor refresh rate for perfect smoothness. Causes the most input lag (2-3 frames?).
Standard double-buffered vsync works where you have one frame being output (the frontbuffer) and one frame being rendered (the backbuffer). When a video refresh occurs, if the backbuffer has finished rendering, it becomes the front buffer. The former front buffer then becomes the backbuffer for a new frame whose rendering gets kicked off. (If a refresh occurs before the backbuffer has finished rendering, you just output the frontbuffer again and let the backbuffer keep rendering).

Double buffered vsync has the characteristic that it will actually stall the GPU when the backbuffer finishes rendering, until a refresh happens. This means that your average framerate tends to be lower, although you also get extremely consistent timing as long as you're maintaining your framerate target.

It also has the impact that any consistent framerate must be an integer fraction of your refresh rate. So, if you're maintaining 30ms frames and your video refresh is 60Hz, your framerate will be exactly 30fps (60/2), but if you suddenly start maintaining 34ms frames, your framerate will drop all the way to 20fps (60/3). Personally I sometimes like this for games that are 30fps with occasional spikes, although it can get nasty with a 60fps target (a drop from 60 to 30 just plain feels bad).

  • Triple-buffered VSync: Same as VSync except with more buffers. Input lag reduced but not eliminated (1 frame?). Many games do not have this option.
This article does a good job of explaining VSync and Triple-Buffering in more depth: http://www.anandtech.com/show/2794
Triple-buffering works like double-buffering, except that instead of stalling the GPU when a backbuffer finishes, it immediately starts rendering into another backbuffer (there are two backbuffers). Whenever a video refresh happens, the most recently-finished frame is made into the new frontbuffer (if the most recently-finished frame is still the frontbuffer, then the frontbuffer is output again).

By never stalling the GPU, triple-buffering maintains higher average framerates than double-buffering. Essentially, it's allowing the game to render at the intermediate framerates (between the previously-mentioned integer fraction framerates), while still maintaining vsync. Of course, because these intermediate rates don't align with a fraction of the refresh, the result tends to be more juddery than with double-buffering

It's not strictly correct to say that triple-buffering gives better input lag than double-buffering. Double-buffering can actually be better if a game is consistently hovering at just faster than an integer fraction of the refresh, since the stalling forms a very good mechanism for timing frame kickoffs; the GPU will only stall for a moment before starting a new render, and frames will finish rendering just barely before the refresh. This having been said, triple-buffering should give lower input lag at most intermediate performance levels.

//=========================

Also, everything above regarding the tradeoffs pertains to theoretical ideal cases. Real-world implementations sometimes have issues that make one option or another a more clear winner or loser. There's also shenanigans with regard to pipelining graphics rendering across several frames that isn't totally accounted for. Also, in the real world there are a bunch of alternative compromise approaches, like "double-buffer vsync as long as maintaining 60fps, turn vsync off whenever drops occur."
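To make the mechanics above easier to poke at, here's a small idealized simulation (my own sketch, using the same simplifying assumptions as the post: constant render time, negligible game-logic time, 60Hz refresh). It reports the average delay from a frame kicking off to it becoming the front buffer under each scheme:

Code:
# Idealized simulation of double- vs triple-buffered VSync on a 60Hz display.
# "Latency" is the time from a frame kicking off to it becoming the front
# buffer. My own sketch, not taken from any driver.
REFRESH = 1 / 60  # seconds per refresh

def double_buffered(render_time, frames=2000):
    latencies, kickoff, t = [], 0.0, REFRESH
    finish = render_time
    while len(latencies) < frames:
        if finish <= t:                          # backbuffer ready at this refresh
            latencies.append(t - kickoff)        # it becomes the front buffer
            kickoff, finish = t, t + render_time # next frame kicks off now
        t += REFRESH                             # otherwise the GPU stalls
    return sum(latencies) / len(latencies)

def triple_buffered(render_time, frames=2000):
    # The GPU never stalls: frames render back to back, and at each refresh
    # the newest finished frame is shown while older finished frames drop.
    latencies, pending, t = [], [], REFRESH
    kickoff, finish = 0.0, render_time
    while len(latencies) < frames:
        while finish <= t:                       # collect frames finished by t
            pending.append((kickoff, finish))
            kickoff, finish = finish, finish + render_time
        if pending:
            latencies.append(t - pending[-1][0]) # newest finished frame shown
            pending.clear()
        t += REFRESH
    return sum(latencies) / len(latencies)

for ms in (8, 16, 20, 25):
    rt = ms / 1000
    print(f"{ms}ms frames: double {double_buffered(rt) * 1000:.1f}ms, "
          f"triple {triple_buffered(rt) * 1000:.1f}ms")
# With exactly 16ms frames double-buffering wins (~16.7ms vs ~24ms average);
# at most other render times triple-buffering comes out ahead.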
 

Tain

Member
shockdude said:
Out of curiosity, would you be able to test any notable visual or latency differences of GSync 120Hz vs. No-VSync 120Hz, or even GSync 60Hz vs No-VSync 60Hz? If you can't notice any difference (besides tearing, obviously), that's ok.

For shits, I just did a 60fps camera test (not the best, I know) of MAME using my ROG Swift the other day. No sync vs Vsync vs Gsync. At 60hz for some reason.

No sync and Gsync had the same results, Vsync had an additional 3-4 frames of lag on the regular (at 60hz). Nothing unexpected, but it was the first time I did any kind of real lag testing for gsync.
 

pottuvoi

Banned
Triple buffering can actually add up to a frame of latency if it doesn't allow dropping un-used frames.

i.e.
The game has a framerate of 240 on a 60Hz screen. If the triple buffer allows reuse of backbuffers and just keeps rendering new frames until it's time to show a new image, the monitor shows 60 images out of the 240 fully rendered ones (this reduces lag compared to normal V-Sync).

If it only allows starting a new frame once the previous one has made it to a refresh, it is still locked to a maximum of 60Hz and most likely causes additional lag.

Triple buffering can also cause horrid framerate jittering compared to normal V-Sync (which usually drops the framerate to the next possible stable framerate and keeps it there).
At 45fps on a 60Hz display, some frames will be held for 33ms and others for 16ms.
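The 45fps case works out like this (simple arithmetic, assuming new frames are presented on a fixed 60Hz vblank grid):

Code:
# Frame pacing of a steady 45fps stream shown on 60Hz vblanks.
# New frames finish every ~22.2ms but can only appear every ~16.7ms,
# so on-screen hold times are a mix of one and two refreshes.
refresh_ms = 1000 / 60
frame_ms = 1000 / 45

shown = []                         # (frame index, vblank where it appears)
for k in range(1, 13):             # first 12 vblanks
    t = k * refresh_ms
    newest = int(t // frame_ms)    # newest frame finished by this vblank
    if not shown or newest != shown[-1][0]:
        shown.append((newest, t))
holds = [round(b - a, 1) for (_, a), (_, b) in zip(shown, shown[1:])]
print(holds)                       # mix of ~16.7ms and ~33.3ms holds -> judder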


Pre-Rendered Frames is very important when talking about additional lag, especially when using V-Sync.
This is the setting that tells how many frames of data the CPU queues up to send to the GPU so the GPU won't stall due to a slow CPU.

In most cases where the mouse is very unresponsive with V-Sync, it's due to a slower-than-optimal framerate combined with a high PRF setting (a 30fps game with 3 frames in the queue is ~100ms of latency, of which at least 66ms is additional due to this setting).


A fun note about tearing:
If the framerate is ridiculously high in comparison to the display's refresh rate, the result is very similar to a rolling shutter on a camera.
Frame limiting should be used with vsync off and be 1 fps lower than your refresh rate.

- same input lag as no vsync
- it eliminates judder
- shouldn't screen tear
- doesn't force video cards to 100% all the time (sometimes quieter)
Without proper sync all framerates tear; the question is how noticeable it is and whether the tear line moves.
 

shockdude

Member
Also, everything above regarding the tradeoffs pertains to theoretical ideal cases. Real-world implementations sometimes have issues that make one option or another a more clear winner or loser. There's also shenanigans with regard to pipelining graphics rendering across several frames that isn't totally accounted for. Also, in the real world there are a bunch of alternative compromise approaches, like "double-buffer vsync as long as maintaining 60fps, turn vsync off whenever drops occur."

Thanks for the detailed info, linked to your post in OP and added adaptive VSync.
For whatever reason the games I play will play at any FPS even with VSync enabled. Not sure why, maybe it's RadeonPro? Or maybe I haven't tested enough games.
Also imo a synced jittery 40-50FPS @ 60Hz is still a better alternative than screen tearing, and is acceptable when it's infrequent.

For shits, I just did a 60fps camera test (not the best, I know) of MAME using my ROG Swift the other day. No sync vs Vsync vs Gsync. At 60hz for some reason.

No sync and Gsync had the same results, Vsync had an additional 3-4 frames of lag on the regular (at 60hz). Nothing unexpected, but it was the first time I did any kind of real lag testing for gsync.
That's awesome. Man I hope gsync/freesync become mainstream soon.

Triple buffering can actually add up to a frame of latency if it doesn't allow dropping un-used frames.

ie.
Game has framerate of 240 on 60hz screen, if triple buffer allows reuse of backbuffers and just renders new frame until it's time to show new image the monitor shows 60 images out of 240 fully rendered ones. (reduces lag when compared to normal v-sync)

If it only allows starting a new frame if previous was finished before the refresh it is still locked to maximum of 60hz and most likely causes additional lag.

Triple buffering can also cause horrid framerate jittering when compared to normal V-Sync. (which usually drops framerate to next possible stable framerate and keep it there.)
In 45fps it will show every other frame 33ms and every other 16ms. (on 60hz display)

Makes sense, and it does explain why triple-buffering @ 60FPS has more input lag than no-vsync @ 60FPS. Unfortunately I don't have a computer powerful enough to do a proper test at higher framerates.

Granted this doesn't explain how RadeonPro removes that extra frame of input lag with triple-buffering. I still think it's magic lol.

Pre-Rendered Frames is very important when talking about additional lag, especially when using v-sync.
This is setting which tells how many frames of data CPU creates into a queue to send to a GPU so it would not stall due to slow CPU.

In most cases where mouse is very un-responsive with V-Sync is due to slower than optimal framerate combined with high setting in PRF. (30fps game and 3 frames in queue is ~100ms of latency of which at least 66ms is additional due to this setting..)

I played around with Nvidia's pre-rendered frames setting and it didn't seem to do much with the VSynced games I tried (Super Meat Boy, L4D2).
 

loganclaws

Plane Escape Torment
Play Street Fighter 4 on PC, you'll immediately notice the effect of vsync on input lag. Even gsync on a 144hz monitor introduces input lag, albeit less than vsync.
 

UnrealEck

Member
In many games I have noticed input response decrease with vsync on. I'm sure it's not placebo either and that there's plenty of proof in testing with software to monitor things.
 

algert

Banned
Magic or Placebo? And other input lag questions.

Basically, what are people’s experiences with these tricks above? Do they work? Are there any more that I’m unaware of? Is this topic even threadworthy? And so on.

There are a lot of input lag scenarios that I'd like to know more about, or know nothing about at all, but don't have the resources to comprehensively and/or accurately test. I've described most of the scenarios above, but here are a couple more.
Does windowed gaming come at the cost of input latency compared to fullscreen? Will VSync+windowed affect input lag? Will RadeonPro+Windowed do anything?
What do other tools (MSI Afterburner, D3DOverrider, etc.) do that RadeonPro can’t do?
How is input latency when using GSync?
Is it possible to construct an input-lag-testing robot that sends a keyboard command and then takes a picture of the monitor?

If you'd like to minimize input lag then don't use v-sync in any form and play in fullscreen, that's all there is to it. Frame limiting will add delay, relative to an unlimited framerate. Playing in a fullscreen window can add delay, relative to fullscreen. Personally, I don't perceive any appreciable screen-tearing at a stable 60 fps in any of the games I play. The "judder" (pulldown) you refer to doesn't exist outside of pre-rendered cutscenes. If you're instead referring to stutter, then I'd suggest that you're simply experiencing a variable framerate. Adjust settings until framerate is stable and matches refresh rate in that case.

e: Adjust settings until framerate is stable at refresh rate or exceeds it.
 

KKRT00

Member
In many games I have noticed input response decrease with vsync on. I'm sure it's not placebo either and that there's plenty of proof in testing with software to monitor things.

I have noticed the other way around: in most games where I've activated v-sync I got more input lag, examples being CSS, BF3, Skyrim or RAGE.
I'm very sensitive to input lag and it annoys me greatly, that's why I prefer tearing over input lag any day of the week.

Thank god for G-Sync :)
 

Vash63

Member
I can't stand vsync in any mouse driven game. Too much lag.

Luckily I bought Gsync, so I only have this issue with games too stubborn to have a full screen mode (I only know of two games without full screen, StarCraft 2 and Evil Within).
 
TL;DR: I use RadeonPro’s “Lock framerate to refresh rate” option to eliminate input lag with VSync+Triple Buffering. Perfect 60FPS with almost no input lag feels amazing. What about you? What other options are there?
Edit: Seems like I got confused.

For competitive games I use NO vsync with frame rate limit.

Can't wait for FreeSync.
 

GoaThief

Member
In many games I have noticed input response decrease with vsync on. I'm sure it's not placebo either and that there's plenty of proof in testing with software to monitor things.
That is a blatant falsity due to how VSync works.

I am sensitive to input lag with a mouse and keyboard; if I'm playing with them I have absolutely no choice but to turn it off, as I cannot aim with it on after years of conditioning in competitive FPS. When using a controller it's a one-frame buffer with double buffering (I find triple can add more latency), although with a controller I'm far less sensitive to input lag.
 

Seanspeed

Banned
I'm clearly not sensitive to this at all. At least right now. I don't think I've ever noticed input lag. And I play games like Battlefield and racing sims with vsync on quite happily. I certainly don't play or drive any better with it off.
 

Elija2

Member
I've got a question: If I'm playing a game and my framerate fluctuates between 30 and 60 but I don't want to lock my framerate to 30 is it better to turn on double buffering, triple buffering, or to keep it all off?
 

Durante

Member
I'd write a long post here, but I have to get to work (and this week I actually have to work at work. the horror :p).

So the short of it:
  • Graphics drivers are deeply pipelined because that's what gives you high FPS scores in benchmarks.
  • Real triple buffering drops frames and always reduces or at worst maintains input lag compared to double buffered vsync, but people often call adding an additional buffer to the queue "triple buffering" and that adds latency.
  • What I suspect external framerate limiting does in some games is delay the point in time where they sample input until just before they start rendering the frame (resulting in frames always having the impact of the most recent input sampling built-in).
  • if you care at all about input lag, reduce your "number of frames to render ahead" setting.
  • It's actually perfectly possible to have less input lag than a perfect "locked" refresh rate framerate with correctly implemented triple buffering (e.g. with 120FPS on a 60Hz monitor you will have less input lag than with 60 FPS).
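A quick worked example of that last bullet, under idealized assumptions (drop-frames triple buffering, perfectly constant frame times), purely to illustrate the direction of the effect:

Code:
# At 60Hz, a vsync-locked 60FPS frame kicks off at one refresh and is shown
# at the next, so it is always a full refresh old when it hits the screen.
# With frame-dropping triple buffering at 120FPS, the frame picked at each
# refresh finished within the last ~8.3ms, so it is newer on average.
refresh_ms = 1000 / 60
locked_60_age = refresh_ms                       # ~16.7ms, kickoff to display
tb_120_age = (1000 / 120) + (1000 / 120) / 2     # render + average wait, ~12.5ms
print(locked_60_age, tb_120_age)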

Say what? Triple buffering = three frames added lag. Simple as that.
Bull.
 

GoaThief

Member
Agree with quite a lot of what you say there, Durante, although I'd say that if you care about input lag the only real option is to disable VSync, and in my personal experience nearly every experiment I've attempted with triple buffering definitely added an additional buffer (not "real" triple buffering as you called it), resulting in higher latency.

If as you say it's possible to have lower input lag with "real" triple buffering over double (disabled VSync is still preferable IMHO), you're going to run into consistency issues which is probably my next biggest bugbear. Although that will matter less after a point if you're well into the hundreds of frames a second. The latter is a different discussion however so maybe best left for elsewhere.

Out of interest could you name some titles which use "real" triple buffering? I know the source engine doesn't use it, I unfortunately can't recall which other games I've tried it with as it was some time ago.

This. It's the one setting that will virtually always decrease input lag. It works wonders, especially when opting to use a 30fps lock.
If you absolutely must play with Vsync on this is essential IMHO, probably alongside double buffering due to the above issues.
 

Arkanius

Member
I'd write a long post here, but I have to get to work (and this week I actually have to work at work. the horror :p).

So the short of it:
  • Graphics drivers are deeply pipelined because that's what gives you high FPS scores in benchmarks.
  • Real triple buffering drops frames and always reduces or at worst maintains input lag compared to double buffered vsync, but people often call adding an additional buffer to the queue "triple buffering" and that adds latency.
  • What I suspect external framerate limiting does in some games is delay the point in time where they sample input until just before they start rendering the frame (resulting in frames always having the impact of the most recent input sampling built-in).
  • if you care at all about input lag, reduce your "number of frames to render ahead" setting.
  • It's actually perfectly possible to have less input lag than a perfect "locked" refresh rate framerate with correctly implemented triple buffering (e.g. with 120FPS on a 60Hz monitor you will have less input lag than with 60 FPS).

Bull.

Any facts about Borderless Fullscreen? I have been using that instead of Fullscreen + Vsync whenever I can.
 
So if I'm using frame-limiting software like RivaTuner at 60fps on a 60Hz display, am I doing it wrong and should I use 59fps? Can somebody elaborate on why this is better? Don't you get a duplicate frame and judder then?

I was already very happy when I discovered borderless window + constant frametime gaming; it was so muuuch smoother vs. standard fullscreen vsync 60fps (with irregular frametimes). I'm not too sensitive to input lag I think, so I can't comment on that.
 

UnrealEck

Member
I have noticed other way around, most games i've activated v-sync i got more input lag
No that's what I'm saying. I see the response of my input decrease. The controls become less responsive. More laggy. Higher latency.
I think I worded it a bit poorly.
 

Dunkley

Member
I always wondered how frames rendered ahead work in relation to Vsync, but I am not too versed on the technical side, so sorry if this comes off as really dumb.

So assuming Vsync is a constant buffer switcharound with one frame rendering while the other is shown, and triple buffering works with two buffers being utilized, how exactly do pre-rendered frames come into play there?

Are they stored elsewhere and just thrown into the buffers after each other, or are they additional buffers added to the vsync buffers that fulfill the same role as the vsync frame buffers?

PS: To ensure the least amount of input lag on adaptive / no vsync, how many frames should be rendered ahead? Can I leave the setting to be set to 1 without any issues or does this hurt performance/cause input lag more than having it set to 2 or 3?
 
I've found that using different methods like vsync, borderless, triple buffering etc. produces quite a variety of results depending on the game. Most of the time games don't explicitly state how the vsync implementation actually works. Also windowed mode vs. full screen can have plenty of impact to FPS, or not at all. Sometimes frame rate caps help too, especially with stuttering.

Personally I've moved on to playing most of my games with a controller, and if vsync does increase input lag, I'm not really bothered about it. Only when a game uses vsync that only accepts 60/30 FPS I will go no vsync with borderless if possible, or adaptive vsync. In my experience I used to go vsync off always, but these days the vsync implementations can work really well. For example in DA:I the best visual quality and performance I got was with full screen + vsync. Borderless works ok but reduces framerates somewhat.
 

HTupolev

Member
[*]Real triple buffering drops frames and always reduces or at worst maintains input lag compared to double buffered vsync
As I detailed above, this isn't strictly true. Imagine the situation where frames are taking exactly 16ms to render, with a 60Hz video signal, and where the time it takes for the CPU to tick the game logic is negligible.

In the double-buffered case, frames will always kick off on a video refresh, and finish rendering exactly .67ms before the next refresh, resulting in frames becoming the front buffer just 16.67ms after kicking off.

In the triple-buffered case, some frames will do that, and some will even do slightly better... but you'll also have circumstances where a frame kicks off 2ms after a refresh, isn't ready for the next refresh, and the frame after it doesn't make the third refresh, so the first frame gets displayed on said third refresh (almost two full refreshes after kicking off). The graphical buffering delay will be evenly distributed across the whole range, so in this scenario, the double-buffered case averages almost half a refresh better than the triple-buffered case.

The triple-buffered case does wind up taking the lead as performance rises*, though, ultimately approaching a half-refresh advantage over the double-buffered approach. (And based on a simulation I did a little while ago, triple-buffered should have the advantage under most performance levels.)

*Within the intervals between integer fractions of the refresh, with performance above refresh behaving as its own range.
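Putting rough numbers on that (my own arithmetic, same assumptions: 60Hz refresh, exactly 16ms frames):

Code:
# Worked numbers for the 16ms-frames-at-60Hz scenario above.
refresh = 1000 / 60                     # ~16.7ms between refreshes

# Double-buffered: every frame kicks off on a refresh, finishes 16ms later,
# and is displayed on the very next refresh.
double_latency = refresh                # always ~16.7ms

# Triple-buffered: frames render back-to-back, so the gap between a frame
# finishing and the next vblank drifts; it averages half a refresh.
triple_latency_avg = 16 + refresh / 2   # ~24.3ms on average

print(double_latency, triple_latency_avg)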
 

co1onel

Member
V-sync will absolutely murder my timing in any music game, and I can notice it in most other types of games. I usually don't bother with it.
 

tioslash

Member
As much as I don't like input lag, I absolutely despise screen tearing. That said, I make some compromises:

If I'm playing a multiplayer game (like BF4, CS), I always turn V-sync OFF and deal with the tearing;

If it's a single player game then it really depends, but more often than not I try to adjust the settings so I can get a locked 60FPS, or at least as close as possible to that, and play with V-sync ON, even if that introduces a bit of input lag.
 

Arulan

Member
There is a lot to cover with regards to this topic so I'll try to be thorough.

Let's start with Triple-Buffered Vsync: the correct implementation allows for dropping of frames, whereas the adding-an-additional-buffer-to-the-Vsync-queue version does not. Unfortunately most people refer to the latter as triple buffering, and it is the form used in most games. The former is achievable by using windowed mode in Windows.

real Triple-Buffered Vsync

+ Removes screen-tearing
+ Minimal input latency
- Judder

Unfortunately real triple-buffering is similar to "No Vsync" in its output of frame times resulting in motion judder.

common Triple-Buffered Vsync

+ Removes screen-tearing
+ Higher "average" frame rate than double-buffered Vsync
+ Potential for perfect motion
- Potential for significant input latency

What I mean by potential is that it depends on what your frame rate is relative to your refresh rate. If your frame rate is consistently under your refresh rate then the "back buffers" won't fill up, which is what causes the additional input latency (that is filling them up causes increased input latency). High-refresh displays in particular have an advantage because it's much harder for your frame rate to surpass the refresh rate. As for the potential for perfect motion, if your frame rate consistently matches your refresh rate then your output of frame times will be consistent.

Double-Buffered Vsync

+ Removes screen-tearing
+ Potential for perfect motion
+ Reduced input latency when compared to "common triple-buffered Vsync"
- Increased input latency when compared to "No Vsync"

As with common triple-buffering it has potential for perfect motion, meaning no judder if and when you can consistently match frame rate with refresh rate.

No Vsync

+ No additional input latency
- Judder
- Screen-tearing

As for frame limiters and caps: if you're using a form of Vsync, capping your frame rate one under your refresh rate (59 for example) will reduce input latency because it prevents your back buffers from filling up and stalling. However, this introduces judder. Capping your frame rate to 60 instead helps to reduce input latency without introducing judder, but the reduction in input latency isn't as large as with the 59 cap.

If you're using "No Vsync", capping your frame rate isn't usually worth it unless the increased frame rate is causing your CPU to stress itself out too much, causing inconsistency. If you do cap it, there are certain values to aim for and avoid. Values too close to the refresh rate (and its 2x, 3x, etc. harmonics) will cause it to tear more violently. If you can find the right value (which I don't remember) you can move the tear line very close to off-screen.

In my experience I've found that the absolute best method for capping your frame rate is to use RivaTuner Statistics Server (it comes with MSI Afterburner).

If you want to cap your frame rate below your refresh rate, such as 30 on a 60Hz monitor, use half-refresh standard Vsync through Nvidia Inspector (or I believe "Double Vsync" through AMD's software).

I've usually found in-game Vsync to be unreliable both in terms of input latency and motion judder. I would recommend using Vsync through Nvidia Inspector, or using D3DOverrider (which supports both double-buffered and "common triple-buffered Vsync"). When comparing double-buffered Vsync between Nvidia's drivers and D3DOverrider, the latter has less input latency, but I believe Nvidia's solution is overall more consistent in terms of lack of motion judder.

A few scenarios and my recommendations:

If you want a 30 fps cap on a 60Hz monitor: Half-refresh standard Vsync (Nvidia Inspector) + RivaTuner Statistics Server cap to 30.

Game where you can consistently hit 60 and prioritize smooth motion: Double-buffered Vsync (either NVInspector for consistent motion or D3DO for slightly reduced input latency) + RivaTuner Statistics Server cap to 60.

Game where you fluctuate and cannot achieve 60, but do not want to cap to 30: Borderless-windowed mode + disable Vsync in-game + RivaTuner Statistics Server cap at 60.

Keep in mind a high-refresh-rate monitor (such as 120Hz) will significantly reduce the input latency with Vsync all-around. For the most part I do not recommend "common triple-buffering", as I feel it is outclassed by double-buffered Vsync when perfect motion is a possibility (because of the higher input latency), and it doesn't surpass "real triple-buffering" when perfect motion isn't a priority.

I hope I didn't forget anything. ;)
 

HTupolev

Member
If your frame rate is consistently under your refresh rate then the "back buffers" won't fill up causing additional input latency. High-refresh displays in particular have an advantage because it's much harder for your frame rate to surpass the refresh rate.
Your second sentence is correct, although your first is wrong and disagrees with it. The issue with "common triple buffering" is if your buffers do fill up. In that case, at the next refresh, the game will display a frame from the "next" buffer even though the "next next" buffer has a newer frame ready.
 

Arulan

Member
Your second sentence is correct, although your first is wrong and disagrees with it. The issue with "common triple buffering" is if your buffers do fill up. In that case, at the next refresh, the game will display a frame from the "next" buffer even though the "next next" buffer has a newer frame ready.

Correct, I meant to say "won't fill up, which is what causes the additional input latency", implying that filling them up causes the input latency.
 

Durante

Member
Good post Arulan!

As I detailed above, this isn't strictly true. Imagine the situation where frames are taking exactly 16ms to render, with a 60Hz video signal, and where the time it takes for the CPU to tick the game logic is negligible.
I see what you mean, but I think this case would be exceedingly rare. Still, correct and noted.

The triple-buffered case does wind up taking the lead as performance rises*, though, ultimately approaching a half-refresh advantage over the double-buffered approach. (And based on a simulation I did a little while ago, triple-buffered should have the advantage under most performance levels.)
Why only a half-refresh advantage? With, say, 600 FPS triple-buffered at 60 Hz you should at most get ~3ms of lag, which is much less than 16.6/2.
 