
PC VSync and Input Lag Discussion: Magic or Placebo?

Arulan

Member
I'm confused by part of this.

If you have a 120 Hz (refresh rate) monitor and your framerate is much lower than 120 Hz, that will have lower input latency than otherwise? If you have a 60 Hz monitor and your framerate is much HIGHER than 60 Hz, that means you'll have HIGHER input latency?

Both of those examples are the opposite of what I would expect.

Not exactly. The input latency derived from your frame rate (frame time) will be lower the higher your frame rate is. However, when your frame rate goes over the refresh rate, the way Vsync works causes the back buffers to fill up and stall the GPU. So it's not about having a much lower frame rate (below your refresh rate); as long as you're at 59 (on a 60Hz display) it'll prevent the back buffers from filling up. However, capping to 59 will cause judder, so it's best to just cap at 60 (if you're aiming for 60fps on a 60Hz display) to eliminate some of the input latency of going over the refresh rate with Vsync without introducing judder.
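To make the capping idea concrete, here's a minimal frame-limiter sketch in C++ (purely illustrative; it's not how RTSS, Afterburner or any in-game limiter is actually implemented, and targetFps / do_frame are placeholder names):

Code:
#include <chrono>
#include <thread>

// Cap the render loop at the display's refresh rate so Vsync's back buffers
// never back up behind the scanout.
void run_capped(double targetFps, bool (*do_frame)())
{
    using clock = std::chrono::steady_clock;
    const auto budget = std::chrono::duration<double>(1.0 / targetFps);
    auto deadline = clock::now() + budget;

    while (do_frame()) {   // input, simulation, render, present
        // Sleep away most of the remaining time, then spin briefly for precision.
        std::this_thread::sleep_until(deadline - std::chrono::milliseconds(1));
        while (clock::now() < deadline) { /* spin */ }
        deadline += budget;   // advance from the old deadline, not "now", so the cap doesn't drift
    }
}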
 

riflen

Member
I'm confused by part of this.

If you have a 120 Hz (refresh rate) monitor and your framerate is much lower than 120 Hz, that will have lower input latency than otherwise? If you have a 60 Hz monitor and your framerate is much HIGHER than 60 Hz, that means you'll have HIGHER input latency?

Both of those examples are the opposite of what I would expect.

Yes, you've got it. Arulan's scenarios are correct as I understand. There's always latency, but additional latency can be caused with Vsync when the GPU completes frames at a higher rate than the display is scanning out at. Instead the frames are queued and shown when possible.
Of course, the higher the rate the display can scan out at, the harder it is for the GPU to fill up that queue. You can mitigate this by using a faster monitor, stressing the GPU more, or limiting the maximum frame rate of the game.

This shouldn't really happen if you're utilising a system that allows frames to be discarded if they're not required. True triple-buffering allows for frames to be discarded from the back buffers, whereas the multi-buffer queuing system that's commonly used in Direct3D games does not discard frames, I think.
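To picture the difference, here's a toy model in C++ (conceptual only, not actual driver or Direct3D code): the queuing approach blocks the renderer when the buffers are full, while the discarding approach throws away the stale frame instead.

Code:
#include <cstddef>
#include <deque>

struct Frame { int id; };

// Common D3D-style present queue: frames are never discarded, so once the
// queue is full the renderer has to wait, and every queued frame adds latency.
struct FifoPresentQueue {
    std::deque<Frame> pending;
    std::size_t capacity = 2;                            // e.g. two back buffers
    bool try_queue(const Frame& f) {
        if (pending.size() >= capacity) return false;    // renderer stalls here
        pending.push_back(f);
        return true;
    }
};

// "True" triple buffering: the renderer never blocks; when both back buffers
// already hold finished frames, the older one is simply dropped and the display
// scans out the newest completed frame at the next refresh.
struct DiscardingBackBuffers {
    std::deque<Frame> pending;
    std::size_t capacity = 2;
    void queue(const Frame& f) {
        if (pending.size() >= capacity) pending.pop_front();   // discard the stale frame
        pending.push_back(f);
    }
};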
 

Arulan

Member
Another topic I forgot to mention in my original post: SLI (and Crossfire)

Unfortunately I'm not as familiar with this, and therefore not completely confident, so if anyone has further insight please feel free to post.

With SLI you will always be adding additional input latency (I believe it's one frame), but that can be mitigated if you're rendering at twice the frame rate of a single card. SLI micro-stutter, despite having improved vastly over the years, is still going to be present to some degree and will affect motion smoothness. To what degree depends on a lot of things, notably the game itself, and some users may notice the difference more than others.

With regards to Vsync, due to how AFR works I believe "common Triple-Buffered Vsync" does not work unless you have an odd number of cards in SLI. I also believe there are issues with "real Triple-Buffered Vsync" and windowed mode in general not working. Again, I can't say this with complete confidence, so if anyone can further elaborate that would be great.
 

riflen

Member
Another topic I forgot to mention in my original post: SLI (and Crossfire)

Unfortunately I'm not as familiar with this, and therefore not completely confident, so if anyone has further insight please feel free to post.

With SLI you will always be adding additional input latency (I believe it's one frame), but that can be mitigated if you're rendering at twice the frame rate of a single card. SLI micro-stutter, despite having improved vastly over the years, is still going to be present to some degree and will affect motion smoothness. To what degree depends on a lot of things, notably the game itself, and some users may notice the difference more than others.

With regards to Vsync, due to how AFR works I believe "common Triple-Buffered Vsync" does not work unless you have an odd number of cards in SLI. I also believe there are issues with "real Triple-Buffered Vsync" and windowed mode in general not working. Again, I can't say this with complete confidence, so if anyone can further elaborate that would be great.

I have some idea here. First, AFR SLI is only supported in Full Screen modes, although windowed modes seem to kind of work with poor scaling in some games.

AFR SLI requires the use of additional buffers in the common queue. I believe if you're using AFR SLI mode, you cannot reduce the number of buffers in the queue to fewer than 3. I'm not sure about the behaviour with > 2 cards, but with 2 cards in AFR mode, games can always run at an arbitrary frame rate with Vsync enabled. This suggests there are at least 3 buffers in this mode.
The queue is required to allow decent performance scaling I think. There's a Nvidia PDF online that gives some insight:

By default, Direct3D allows buffering up to three frames' worth of work before Direct3D calls will start stalling. Historically, this was done to grant a performance benefit on single GPU systems by allowing overlap of CPU and GPU work. SLI configurations require this buffering of frames to achieve performance scaling in AFR mode.
The drawback of this buffering is that it introduces additional latency between the user's input and its results being visible onscreen (often referred to as “input lag”). The overall goal of SLI is to achieve the greatest performance benefit while minimizing perceived input lag.

Some applications try to minimize the amount of frame buffering to reduce input latency. A common approach to do this is the use of Direct3D Event queries: the application inserts an Event query at the end of every frame and then, one frame later for example, stalls by checking the state of this query on the CPU until the Event has been reached by the GPU. This allows the application to explicitly control how many frames are buffered, but also prevents AFR scaling.

Some interesting stuff in it: http://developer.download.nvidia.com/whitepapers/2011/SLI_Best_Practices_2011_Feb.pdf
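For anyone curious, here's roughly what that Event-query throttle looks like in D3D9 code (a sketch only, with error handling omitted; MAX_BUFFERED_FRAMES and the function name are placeholders, not anything taken from the paper):

Code:
#include <d3d9.h>

const int MAX_BUFFERED_FRAMES = 1;              // how far the CPU may run ahead of the GPU
IDirect3DQuery9* g_fence[MAX_BUFFERED_FRAMES + 1] = {};
int g_frame = 0;

void EndOfFrame(IDirect3DDevice9* dev)
{
    int slot = g_frame % (MAX_BUFFERED_FRAMES + 1);
    if (!g_fence[slot])
        dev->CreateQuery(D3DQUERYTYPE_EVENT, &g_fence[slot]);
    g_fence[slot]->Issue(D3DISSUE_END);         // fence: the GPU signals it when it gets there

    // Wait on the fence issued MAX_BUFFERED_FRAMES frames ago.
    int waitSlot = (g_frame + 1) % (MAX_BUFFERED_FRAMES + 1);
    if (g_frame >= MAX_BUFFERED_FRAMES && g_fence[waitSlot])
        while (g_fence[waitSlot]->GetData(nullptr, 0, D3DGETDATA_FLUSH) == S_FALSE)
            ;                                   // spin until the GPU has consumed that frame

    ++g_frame;
    // As the paper notes, an explicit throttle like this also prevents AFR scaling.
}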
 
Is there some reason that D3DOverrider doesn't want to work for me? I run it and it just runs in the background, I get no GUI or anything to edit the settings. I have to close it using Task Manager.

Otherwise, people keep mentioning that MSI Afterburner can force triple buffering but I am not seeing any option. I just limit the framerate to 59 but I still get tons of tearing.
 

Arulan

Member
Is there some reason that D3DOverrider doesn't want to work for me? I run it and it just runs in the background, I get no GUI or anything to edit the settings. I have to close it using Task Manager.

Otherwise, people keep mentioning that MSI Afterburner can force triple buffering but I am not seeing any option. I just limit the framerate to 59 but I still get tons of tearing.
You should see D3DO in your notification tray. Afterburner comes with RivaTuner Statistics Server, which can be used as a frame limiter, but it doesn't have any way to force Vsync; that's what D3DO is for.
 

Soodanim

Member
I tried playing Dark Souls borderless fullscreen, but I couldn't dodge shit. That's the only game I've tried where timing matters, but I'm convinced by my experience with it. It wasn't just a 5 minute test, either. I had to adjust when I rolled to actually dodge.
 
Frame limiting should be used with vsync off and be 1 fps lower than your refresh rate.

- same input lag as no vsync
- it eliminates judder
- shouldn't screen tear
- doesn't force video cards to 100% all the time (sometimes quieter)

Some cons?

In some games, since you're limited in frame rate, it's hard to make certain jumps, etc. It depends on the engine, but Source is a good example with Counter-Strike.

With my old crossfire setup this was the only way to play FarCry 3, and quite a few other games because of the hair pulling judder.

Dxtory or something, Afterburner, etc. offer frame limiting.

This still screen tears if vsync is off. On my 60 Hz LCD, capping fps to 59 via Afterburner and turning on vsync eliminates tearing and doesn't add any input lag I can actually notice. Standard vsync with no frame cap has horrendous input lag.
 

geordiemp

Member
TL;DR:

- VSync: More specifically known as double-buffered VSync. Rendered image goes through a buffer before going to the monitor. When implemented properly, eliminates screen tearing and syncs the framerate with the monitor refresh rate for perfect 60Hz smoothness. Usually causes the most input lag (2-3 frames?). May not respond well to framerate dips (e.g. jittering/uneven frame timing, drop to 30FPS instead of to 40-55FPS).

Also, here's a decent article from AnandTech where they test various input lag situations with a high-speed camera (ty riflen).

Sorry to pose a question here, but I often read on GAF that 60 Hz is preferred due to lower input latency and smoothness, and that consoles suffer at 30 Hz due to input lag even if it's a consistent 30 Hz with motion blur and the game is optimised for input lag at that frame rate.

The forum-type comments of lower input lag at 60 Hz (for a game developed at 30 Hz) and your statement seem to suggest the 60 Hz option on PC for 30 Hz console games does not lower input lag?
 

Arulan

Member
Sorry to pose a question here, but I often read on GAF that 60 Hz is preferred due to lower input latency and smoothness, and that consoles suffer at 30 Hz due to input lag even if it's a consistent 30 Hz with motion blur and the game is optimised for input lag at that frame rate.

The forum-type comments of lower input lag at 60 Hz (for a game developed at 30 Hz) and your statement seem to suggest the 60 Hz option on PC for 30 Hz console games does not lower input lag?

I think you mean to use "fps" instead of "Hz" in your reply, but to put it simply the higher frame rate you have, the lower the derived input latency from that frame rate (excluding Vsync and external factors for the moment). For example, if you're playing a game at a consistent (for the sake of argument I'll be using consistent frame rates) 30 fps you have an inherent frame latency (frame time) of 33.3ms. This is the time it takes each frame to be swapped to the front buffer (read: not physically displayed because we're excluding external factors for the moment). If you're playing at a consistent 60 fps, each frame has a latency of 16.7ms.

Now, the difference for the moment is relatively minor (although noticeable), but once you introduce Vsync, and multiple buffers into the mix you now have additional input latency of multiple frames (let's use his example of 2-3 for this argument). 2-3 frames of input latency and you now have 66.7ms - 100ms of input latency for 30 fps, and 33.3ms - 50ms for 60 fps. The difference is much more severe now.
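Just to make that arithmetic concrete, the same numbers as a few lines of C++ (this only models the frame-time component, as above; display and input-device latency are ignored):

Code:
#include <cstdio>

int main() {
    const double rates[] = {30.0, 60.0};
    for (double fps : rates) {
        double frameMs = 1000.0 / fps;          // inherent frame time
        std::printf("%2.0f fps: %.1f ms per frame; with 2-3 buffered frames: %.1f-%.1f ms\n",
                    fps, frameMs, 2 * frameMs, 3 * frameMs);
    }
}
// Output:
// 30 fps: 33.3 ms per frame; with 2-3 buffered frames: 66.7-100.0 ms
// 60 fps: 16.7 ms per frame; with 2-3 buffered frames: 33.3-50.0 ms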
 

Nokterian

Member
I don't know if it's relevant to ask here, but I get huge fps drops in Diablo 3 when large packs or 4 players are on screen. I have v-sync off in-game and adaptive v-sync forced through the Nvidia settings, and I have a BenQ XL2411Z 144Hz monitor, but I can't seem to keep a good framerate.
 

geordiemp

Member
I think you mean to use "fps" instead of "Hz" in your reply, but to put it simply the higher frame rate you have, the lower the derived input latency from that frame rate (excluding Vsync and external factors for the moment). For example, if you're playing a game at a consistent (for the sake of argument I'll be using consistent frame rates) 30 fps you have an inherent frame latency (frame time) of 33.3ms. This is the time it takes each frame to be swapped to the front buffer (read: not physically displayed because we're excluding external factors for the moment). If you're playing at a consistent 60 fps, each frame has a latency of 16.7ms.

Now, the difference for the moment is relatively minor (although noticeable), but once you introduce Vsync, and multiple buffers into the mix you now have additional input latency of multiple frames (let's use his example of 2-3 for this argument). 2-3 frames of input latency and you now have 66.7ms - 100ms of input latency for 30 fps, and 33.3ms - 50ms for 60 fps. The difference is much more severe now.

Understood, thanks. Using Hz and fps interchangeably was lazy on my behalf...

so that means (ducks for cover) that a well optimised 30 FPS console version could have less input latency to play (say 33 ms if it's well made) than a PC port at 60 FPS having to use an external vsync option that may be 66 ms?

Obviously if the game is made for 16 ms on both console and PC, 60 fps, then everybody wins.
 

Portugeezer

Member
I always notice lag. In the BF Hardline beta, the input lag seemed small, but I am sure it was there. For shooters, where you go from using a responsive desktop mouse to an input-delayed game with vsync, it is noticeable.
 

Arulan

Member
so that means (ducks for cover) that a well optimised 30 FPS console version could have less input latency to play (say 33 ms if it's well made) than a PC port at 60 FPS having to use an external vsync option that may be 66 ms?

No, I'm not sure you understood my explanation. For the most part, 30 fps will have roughly double the input latency of a 60 fps game. Of course, if you start comparing extremes like "No Vsync 30 fps" to "common Triple-Buffered Vsync 60 fps exceeding refresh rate" then the difference isn't as large, but for the most part 30 fps will always have more input delay.
 

DeSo

Banned
Okay, so give it to me straight, what program should I use if I intend to use v-sync with triple buffering?
 

shockdude

Member
Okay, so give it to me straight, what program should I use if I intend to use v-sync with triple buffering?
The answer will vary depending on who responds first.

Try RadeonPro. Also try the "lock framerate to refresh rate" option in addition to triple buffering.
 

dmr87

Member
Thanks for the thread, I've always dabbled with this but without knowing the technical aspect behind it. Some nice information in here.
 

Durante

Member
so that means (ducks for cover) that a well optimised 30 FPS console version could have less input latency to play (say 33 ms if it's well made) than a PC port at 60 FPS having to use an external vsync option that may be 66 ms?
No. A 30 FPS game is always less responsive than a 60 FPS game, assuming the same type of frame pacing / vsync. And (as this thread amply describes) you have a lot of options to tweak everything for the lowest possible input latency on PC.

Okay, so give it to me straight, what program should I use if I intend to use v-sync with triple buffering?
Use a program that forces borderless fullscreen windowed mode (or set it in the game itself if it supports it).

And if you feel a given game has juddering or input lag issues with just that, try using the framerate limiter in RivaTuner Statistics Server.
 
I've been using D3DOverrider + in-game Vsync for the longest time for "common" triple-buffered Vsync; should I just switch over to borderless windowed?

What I like about D3DOverrider is that I don't have to set up profiles or enable it for every game; I just leave it on globally and if it detects a 32-bit game it just works. If there's a conflict then I can just add an exception. The downside is that more and more games will be 64-bit only now and D3DOverrider is useless for them. RadeonPro triple buffering didn't work when I tried it on Shadow of Mordor, but thankfully it has a built-in borderless windowed option.

Because I have an Nvidia GPU, RadeonPro doesn't let me set global profile and I don't want to create profiles for every game that I want triple buffering in, which is all of them. RadeonPro is running so I can use SweetFX with D3DOverrider (these normally don't play nice together) or if I need to force SMAA.

Some might find this weird, but the fewer external programs I need to have running the better. I don't like having to manage settings in a bunch of different programs, and if I can globally force settings painlessly then I'd rather do that. Framerate limiting can be done in Nvidia Inspector, so I'd rather do it at the driver level than run MSI Afterburner and RTSS, unless there's a huge difference between the functionality of the two. I am willing to give Borderless Gaming a shot; is it worth ditching D3DOverrider for it?

One more thing, I've never messed around with the Maximum Pre-rendered Frames option, but since I hate tearing and use Vsync in everything, would I be safe just setting this to 1 on the global profile?

Edit: Okay, I just tried out the Borderless Gaming program, does borderless windowed not work with DSR?
 

jorimt

Member
Edit: Okay, I just tried out the Borderless Gaming program, does borderless windowed not work with DSR?

Sadly that's a limitation I believe has yet to be addressed in this thread.

As far as I know, both native and third-party borderless windowed solutions do not support external resolution downsampling. However, borderless windowed mode (Durante, please correct me if I'm wrong here) can work with downsampled resolutions in GeDoSaTo, but only because said resolutions in this case are internal.

One more thing, I've never messed around with the Maximum Pre-rendered Frames option, but since I hate tearing and use Vsync in everything, would I be safe just setting this to 1 on the global profile?

If set to "1," max pre-rendered frames can decrease input lag by reducing the number of prepared frames sent from the CPU to the GPU, but this only guarantees less input lag if the game is preparing more than 1 frame at default; some already prepare less than three to start with, I think. You can set it to "1" globally, but, in my experience, some game engines (depending on your setup) don't always play nicely with it set so low, and it can sometimes cause stutter issues. So setting it per profile is still probably the best way to handle it.
 
Sadly that's a limitation I believe has yet to be addressed in this thread.

As far as I know, both native and third-party borderless windowed solutions do not support external resolution downsampling. However, borderless windowed mode (Durante, please correct me if I'm wrong here) can work with downsampled resolutions in GeDoSaTo, but only because said resolutions in this case are internal.



If set to "1," max pre-rendered frames can decrease input lag by reducing the number of prepared frames sent from the CPU to the GPU, but this only guarantees less input lag if the game is preparing more than 1 frame at default; some already prepare less than three to start with, I think. You can set it to "1" globally, but, in my experience, some game engines (depending on your setup) don't always play nicely with it set so low, and it can sometimes cause stutter issues. So setting it per profile is still probably the best way to handle it.
Thanks for the response. It's a shame that I have to use yet another program to raise the internal resolution if I want to use borderless windowed. Of course I could also raise the desktop resolution before starting a game, but that's a pain in the ass. I guess I'll check out GeDoSaTo at some point; Durante makes good stuff.
 

GoaThief

Member
So setting it per profile is still probably the best way to handle it.
Wouldn't it be best to set it globally to one frame then if problems arise, change the individual game profile (which overrides the global setting)?

Just a time saver really.
 
Thought I'd reply here as this seems to be the most appropriate discussion.

What's the purpose of limiting double-buffered v-sync to my screen's refresh (i.e. 60fps/Hz)? When the back buffer is complete the GPU can't put another (newer) frame in there anyway, so it has to wait. Why does it need a frame limit?

The Anandtech example of triple-buffering uses a game running at 300fps with perfect pacing. It seems good to use TB here, but surely in a more typical scenario of frame times that are fluctuating between 60-80fps, why would I bother? All I stand to gain is some irregular pacing, right?

It would be good if this became a general thread for smooth motion. With that in mind! How high a dpi does a mouse need to be to be as smooth as a controller? Playing Bioshock infinite with a controller is hugely smoother than with a basic mouse for me with seemingly no way to up the dpi.
 

THEaaron

Member
VSYNC ALWAYS introduces heavy input lag for me. It does with triple buffering, with locking the framerate, ... I am very sensitive to input lag. That is why I am a happy owner of a 144hz screen where tearing is nearly a non-issue.

There is literally nothing worse than input lag for me.
 

shockdude

Member
What's the purpose of limiting double-buffered v-sync to my screen's refresh (i.e. 60fps/Hz)? When the back buffer is complete the GPU can't put another (newer) frame in there anyway, so it has to wait. Why does it need a frame limit?

The Anandtech example of triple-buffering uses a game running at 300fps with perfect pacing. It seems good to use TB here, but surely in a more typical scenario of frame times that are fluctuating between 60-80fps, why would I bother? All I stand to gain is some irregular pacing, right?

It would be good if this became a general thread for smooth motion. With that in mind! How high a dpi does a mouse need to be to be as smooth as a controller? Playing Bioshock infinite with a controller is hugely smoother than with a basic mouse for me with seemingly no way to up the dpi.
All I know is that I've used a cell phone camera to measure locking the framerate to the refresh rate, and locking the framerate reduces input lag by 3-4 frames at 60fps. I don't really know the technical reason.
Someone else in this thread made a post theorizing it had to do with syncing the frames to be right after the input window or something like that, but I can't find it.

At locked 60fps, I personally can't tell the difference between double and triple buffering, so I just use double-buffering for everything since I can still get intermediate framerates (e.g. 40fps, 55fps), it uses less VRAM, and it should have less input lag in theory.

I have a basic Logitech wireless mouse and it's plenty smooth. What symptoms are you seeing that make it "not smooth"?

VSYNC ALWAYS introduces heavy input lag for me. It does with triple buffering, with locking the framerate, ... I am very sensitive to input lag. That is why I am a happy owner of a 144hz screen where tearing is nearly a non-issue.

There is literally nothing worse than input lag for me.
Frame pacing isn't an issue at 144Hz? At 60Hz, jitter from high/low framerates is pretty noticeable, but I don't know how it is at higher refresh rates.
 
All I know is that I've used a cell phone camera to measure locking the framerate to the refresh rate, and locking the framerate reduces input lag by 3-4 frames at 60fps. I don't really know the technical reason.
Someone else in this thread made a post theorizing it had to do with syncing the frames to be right after the input window or something like that, but I can't find it.

At locked 60fps, I personally can't tell the difference between double and triple buffering, so I just use double-buffering for everything since I can still get intermediate framerates (e.g. 40fps, 55fps), it uses less VRAM, and it should have less input lag in theory.

I have a basic Logitech wireless mouse and it's plenty smooth. What symptoms are you seeing that make it "not smooth"?


Frame pacing isn't an issue at 144Hz? At 60Hz, jitter from high/low framerates is pretty noticeable, but I don't know how it is at higher refresh rates.
I can't quite get my head round how locking the framerate improves pacing while using v-sync. Would love to hear if someone knows the answer.

I basically just get very unsmooth motion aiming with the mouse compared to a controller. I don't think it samples at a very high rate. I was considering buying a gaming mouse but wouldn't want it to not match the smoothness of the controller.
 
Hmmm. I often see people say something along the lines of "at 144 Hz you can't even see tearing anyway", but that hasn't been the case for me. I still see it clearly.

Side question: If I use vsync on, say, a 120 Hz monitor, will the input lag be only half of what it would be in a 60 Hz monitor because double the frames exist?
 

Unai

Member
Frame pacing isn't an issue at 144Hz? At 60Hz, jitter from high/low framerates is pretty noticeable, but I don't know how it is at higher refresh rates.

He probably has v-sync off, so no frame pacing issues. At 144Hz, screen tearing technically exists, but it's a lot harder to notice.

Side question: If I use vsync on, say, a 120 Hz monitor, will the input latency be only half of what it would be in a 60 Hz monitor because double the frames exist?

Yes. That's one of the reasons 120Hz is a lot better than 60Hz even to play 60FPS or 30FPS games.
 

daninthemix

Member
In my tests I found locking frame-rate to refresh rate either had no effect or was counter-productive. I've found nothing beats driver v-sync or half v-sync. What I do do is disable any other potential frame bottlenecks (in-game v-sync and frame limiters).

In terms of input lag, that's more usefully controlled by managing the max pre-rendered frames settings IMO, with 1 being ideal for most cases, perhaps higher if your CPU is struggling.
 
Yes. That's one of the reasons 120Hz is a lot better than 60Hz even to play 60FPS or 30FPS games.

Okay, interesting. So if I'm playing a 60 fps locked game on my 120 Hz monitor, I still have only half the "downside" to enabling vsync that someone playing it on a 60 Hz monitor would have? Very cool.
 
In my tests I found locking frame-rate to refresh rate either had no effect or was counter-productive. I've found nothing beats driver v-sync or half v-sync. What I do do is disable any other potential frame bottlenecks (in-game v-sync and frame limiters).

In terms of input lag, that's more usefully controlled by managing the max pre-rendered frames settings IMO, with 1 being ideal for most cases, perhaps higher if your CPU is struggling.
I definitely get stutter in Bioshock Infinite when using v-sync if I don't cap to 60 in RTSS.
 
Something I've been struggling to understand for a while now:

If you enable common triple buffering, but set the render ahead queue to 0, how is that any different than "real" triple buffering?
 

Unai

Member
Okay, interesting. So if I'm playing a 60 fps locked game on my 120 Hz monitor, I still have only half the "downside" to enabling vsync that someone playing it on a 60 Hz monitor would have? Very cool.

Exactly.

Something I've been struggling to understand for a while now:

If you enable common triple buffering, but set the render ahead queue to 0, how is that any different than "real" triple buffering?

I guess there's no difference at all. As a matter of fact, we can't even set it to 0 in the Nvidia Control Panel without forcing it with third-party tools. It might be useful if the game only has triple buffering but you want to force double buffering, I guess.

[Screenshot of the relevant NVIDIA Control Panel setting]
 
I guess there's no difference at all. As a matter of fact, we can't even set it to 0 in the Nvidia Control Panel without forcing it with third-party tools. It might be useful if the game only has triple buffering but you want to force double buffering, I guess.

Oops, I meant to say 1, not 0. Presumably, zero would get rid of the actual "third buffer".
 

daninthemix

Member
Something I've been struggling to understand for a while now:

If you enable common triple buffering, but set the render ahead queue to 0, how is that any different than "real" triple buffering?

Two things: triple buffering in the control panel has no effect on DirectX games, only OpenGL, and max pre-rendered frames has nothing to do with triple buffering - it's how many frames the CPU can prepare in advance for the GPU, while triple buffering is the GPU holding a buffer of 3 rendered frames.
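To put those two knobs in concrete terms, a hedged D3D11/DXGI sketch (the values are illustrative, not recommendations):

Code:
#include <d3d11.h>
#include <dxgi.h>

void ShowTheTwoKnobs(IDXGIDevice1* dxgiDevice, DXGI_SWAP_CHAIN_DESC& desc)
{
    // Knob 1 - buffering: how many buffers the swap chain itself holds on the
    // GPU side (roughly, two back buffers plus the front buffer ~ "triple buffering").
    desc.BufferCount = 2;

    // Knob 2 - max pre-rendered frames: how many frames' worth of commands the
    // CPU may record ahead of the GPU. A separate queue from the swap-chain buffers.
    dxgiDevice->SetMaximumFrameLatency(1);
}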
 

Lagamorph

Member
I'll take a frame or two of input lag over screen tearing any day.

But then I don't play competitively and most of my online gaming is MMOs anyway where it's not such a big deal.
 

wbEMX

Member
I play a lot of online FPS, so I usually disable V-Sync. I use it when I'm playing games with a controller on my TV, like Arkham City, Assassin's Creed or racing games. The input lag isn't very noticeable in these situations, but I definitely notice it in games where I use the mouse. I also have the issue that games never look as fluid as I'd like. Lots of times I see judder, although I'm playing with FPS way over 60. I don't know why, but the only games that felt as fluid as first-party Nintendo games were Battlefield 4 and CS:GO. Is this a frame-time issue? How can I fix it, if it's possible? Or is this just a giant placebo? Tearing isn't much of a problem.

System:
Intel Core i7-4790K @ 4.2GHz
NVIDIA GeForce GTX 970 OC
8 GB DDR3 RAM @ 1333MHz
 

nkarafo

Member
I'm still using a CRT monitor at 85hz... vsync always on with no noticeable input lag. And that smooth 85fps in fast paced games without blurring or ghosting... crystal clear and smooth.

I wish my CRT monitor lives forever.
 

Piggus

Member
The only games I always turn vsync off in are Source games like CSS and CS GO. It creates incredibly noticeable input lag, whereas without it I pretty much have a constant 300+ fps with minimal tearing.
 

Vex_

Banned
Glad this thread popped up as I have a question.

I was wondering about the whole turn on frame limiter thing in the OP. It says to place it at 1 fps lower than 60 (on a 60hz obviously) as this gets the best of both worlds.

Wouldn't that affect physics in a negative way without vsync? Havok tends to go crazy if the fps jumps around a lot without vsync, iirc. Just curious, as I have this particular game I am playing with a very particular mod... I am guessing this is ideal for more competitive old-school shooters that don't have physics going on (CS)?
 
Just tried lots of these things in Battlefield 4. Ended up just disabling VSYNC. With 100+ fps the tearing isn't so bad on a 60hz monitor, but around 80-90 it jitters.

Putting pre-rendered frames on 1 while having in-game vsync on almost feels the same as vsync off, but it still isn't as snappy as vsync off. Sure, the game is very playable, but in a competitive shooter I would rather just have exact aiming as a possibility. Without a GSYNC monitor you will just have to learn to live with tearing and jitter. Even at 90fps it's like you are getting 50, but the aiming is precise as hell.

In games like DOOM and call of duty 2 (yes, 2) I do NOT get any tearing however, even when locking to an arbitrary 48 and 73 on my 60hz monitor. Makes my head explode.
 

shockdude

Member
oh boy this thread came back haha.

I've transitioned from RadeonPro to RivaTuner + ingame VSync, as RivaTuner is much more stable and reliable, and it supports locking the framerate to 60FPS globally.
That being said, I still experiment with RadeonPro from time to time. Most recently I've been trying to use RadeonPro's half-refresh VSync with The Crew, as Nvidia doesn't allow driver-level VSync control on laptops.
I wish there was a proper successor to D3DOverrider, with all the VSync override abilities of RadeonPro minus the instability.

I wonder if it's worth it to update the OP or not.
 

razu

Member
Depends on how your input drivers work. If they poll at their own rate and return the most recent results, then placebo. If they get the current state, then it'll reduce latency.

And even then, a lot of games will run physics at a fixed frequency and interpolate for display. E.g. they will generate a new keyframe, which uses input, at 60fps, but then interpolate at any rate (200fps, for example). In this case, placebo. However, you will PERCEIVE better control - this is an important point!

We've tested this in our vehicle simulations. Even 10fps input/physics with 60fps display was reasonably acceptable. Whereas 240fps input/physics with 30 fps display was awful.
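A sketch of that split, with placeholder names (poll_input, step_physics, render are stand-ins, and the 60 Hz step is just an example):

Code:
#include <chrono>

void game_loop()
{
    using clock = std::chrono::steady_clock;
    const double simStep = 1.0 / 60.0;          // input + physics sampled at a fixed 60 Hz
    double accumulator = 0.0;
    auto previous = clock::now();
    bool running = true;

    while (running) {
        auto now = clock::now();
        accumulator += std::chrono::duration<double>(now - previous).count();
        previous = now;

        while (accumulator >= simStep) {        // consume input at the fixed rate...
            // poll_input(); step_physics(simStep);
            accumulator -= simStep;
        }

        double alpha = accumulator / simStep;   // ...then draw between the last two keyframes
        // render(interpolate(previous_state, current_state, alpha));
        (void)alpha;
    }
}

In a loop like this, raising the display rate only adds interpolated frames; input is still sampled at the fixed step, which is why the gain reads as perceived smoothness rather than lower input latency.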

The problem is it's a bit of a 'bench press' issue, where people want to lift the most, but in this case 'notice' the most.
 