That's really good info, thanks a lot!
I hope I didn't forget anything.
Well, for one - VSync doesn't add extra buffers over "No VSync" - it just inserts wait times to synchronize with VBlank.
Does that refer to Aero or the classic shell?
The former is achievable by using windowed mode in Windows.
So what is the vsync in Dying Light? It allows the framerate to float anywhere between 30 and 60, but it drops frames like a motherfucker. In one scene with vsync off, the framerate is 50.
Same scene with in-game vsync on, the framerate is 35.
So this shouldn't be double buffering, since the framerate isn't locked to 30, but the drop is severe as fuck.
And borderless fullscreen in, for example, Ryse or Watch Dogs didn't cause a drop like this at all. But in DL, windowed mode incurs the same performance drop as vsync. What's the deal here?
Essentially, I can play DL locked to 30, which sucks, or with vsync off, which is much better, but the tearing is annoying.
Does that refer to Aero or the classic shell?
I'm clearly not sensitive to this at all. At least right now. I don't think I've ever noticed input lag. And I play games like Battlefield and racing sims with vsync on quite happily. I certainly don't play or drive any better with it off.
I'm fairly certain you need to use Aero.
I agree. I play with Vsync on in BF4, and I don't notice a difference in input lag. With Vsync off, even if framerates are above 60 it just doesn't feel as smooth, like there is a frame pacing issue. Vsync on fixes the issue.
I basically always play with vsync on, unless I'm getting less than 60fps and I feel that vsync is actually taking away a few frames; then I will disable it.
Do you use a 60Hz display?
Same here. I absolutely hate input lag, so I've had to settle for screen tearing all these years. Can't wait for something like Gsync to be the standard.
G-Sync probably won't become the standard, but adaptive sync via DisplayPort 1.2a (and onwards) most likely will be, especially judging from the large support out of the gate.
Yes I do.
Just a little information nugget I thought I'd drop off.
On Windows 7/8, if you play in windowed mode, you cannot control any VSync options manually; it is determined by your Aero enable/disable state.
Windows Aero ON = the entire desktop (including any windows) is VSync'd.
Windows Aero OFF = the entire desktop (including any windows) is NOT VSync'd; you should notice tearing while dragging windows.
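If you want to check the composition state programmatically, here's a minimal sketch using the documented DwmIsCompositionEnabled() call from dwmapi.h (assumes a Windows build environment; note that on Windows 8 the DWM can no longer actually be disabled, so the check mostly matters on Vista/7):

```cpp
// Minimal sketch: query whether DWM composition (Aero) is active.
#include <windows.h>
#include <dwmapi.h>
#include <cstdio>

#pragma comment(lib, "dwmapi.lib")  // MSVC-style linking against dwmapi

int main() {
    BOOL enabled = FALSE;
    if (SUCCEEDED(DwmIsCompositionEnabled(&enabled))) {
        // Composition ON  -> windowed output is composited (VSync'd).
        // Composition OFF -> windowed output can tear.
        std::printf("Aero/DWM composition: %s\n", enabled ? "ON" : "OFF");
    }
    return 0;
}
```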
Is there any chance of someone making labeled diagrams that show the frame buffers, render times, etc. for each mode?
I'm pretty familiar with v-sync and rendering, work as a developer, etc.; I'm just having a hard time visualizing/connecting each approach and how they relate to the input lag. I've always found diagrams and graphics great for understanding stuff like rendering pipelines.
I just want there to be no tearing, and no input lag, as long as I can guarantee over 60fps. Why is this so hard to achieve, or even to understand what all the different options do, on PC, when my Nintendo consoles seem to manage this fine? It's putting me off playing on PC, and when I do play, I use a controller so that I can't 'feel' the latency as much as I could with a mouse.
Wtf! We're probably gonna wait another year for adaptive sync (AMD FreeSync)?
Edit: lol, the article is from May 2014, my bad.
Why is this so hard to achieve, or even to understand what all the different options do, on PC, when my Nintendo consoles seem to manage this fine?
It's not particularly hard - just follow the instructions on the first page. A matter of 5 minutes at most.
Typical gaming latency values are a lot higher than 'imperceptible'. Most people just don't know any better, or they use latency-insensitive controls, so they don't care.
No input lag is an impossibility on all platforms. The only question is whether your chain of devices, from controller, through computer/console, to display, together accounts for a low enough total latency that you're unable to perceive it.
Experiments conducted at NASA Ames Research Center [2, 7, 8, 15] reported latency thresholds for a judgment of whether scenes presented in HMDs are the same as or different from a minimal-latency reference scene.
Subjects viewed the scene while rotating their heads back and forth. These studies found that individual subjects' point of subjective equality (PSE - the amount of scene motion at which the subject is equally likely to judge a stimulus to be different from one or more reference stimuli) varies considerably (in the 0 to 80 ms range) depending on factors such as different experimental conditions, bias, type of head movement, and individual differences. They found just-noticeable differences (JND - the stimulus required to increase or decrease the detection rate by 25% from a PSE with a detection rate of 50%) to be in the 5 to 20 ms range. They also found that subjects are more sensitive to latency during the phase of sinusoidal head rotation when the direction of head rotation reverses (when scene velocity due to latency peaks) than in the middle of head turns (when scene velocity due to latency is smaller) [1].
http://ieeexplore.ieee.org/xpls/abs_all.jsp?arnumber=4811025
Conclusion: The latency JND mean of 16.6 ms and minimum of 3.2 ms over 60 JND values suggest that end-to-end system latency in the 5 ms range is sufficiently low to be imperceptible in HMDs.
Human sensory systems can detect very small relative delays in parts of the visual or, especially, audio fields, but when absolute delays are below approximately 20 milliseconds they are generally imperceptible. Interactive 3D systems today typically have latencies that are several times that figure, but alternate configurations of the same hardware components can allow that target to be reached.
I can't say anything specific about Dying Light as I have not played it but I will address the following misconception:
Double-buffered Vsync DOES NOT mean your frame rate can only be 60, 30, 20, etc. It means your frame times can only be 16.7ms, 33.3ms, 50ms, etc. FPS is Frames Per Second, an average, and therefore the FPS shown when using double-buffered Vsync is just these quantized frame times averaged over the course of one second. However, if you consistently maintain a frame time of over 16.7ms (and under 50ms, for this example) over a period of time, then you will see 30 FPS. "Common triple buffering" works similarly, in that it only outputs frame times of 16.7ms, 33.3ms, 50ms, etc. (for a 60Hz display, of course), and the FPS you're seeing is just the average of all these frames.
A side note: one of the great things about a 120Hz monitor is that you have an additional step between 16.7ms and 33.3ms, namely 25ms (40 FPS). This gives you an additional option if you're aiming for perfect motion (1/3-refresh standard Vsync + RTSS cap at 40) without having to go as low as 30.
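To make the frame-time averaging concrete, here's a small self-contained sketch (the render times are made-up illustrative numbers, not measurements from any game) that rounds each raw frame time up to the next refresh boundary, the way double-buffered Vsync does, at both 60Hz and 120Hz:

```cpp
// Illustrative sketch of VSync frame-time quantization and the resulting FPS average.
#include <cmath>
#include <cstdio>
#include <vector>

int main() {
    // Hypothetical raw render times in ms (i.e., what you'd get with VSync off).
    const std::vector<double> renderTimesMs = {14.0, 18.0, 20.0, 17.0, 15.0, 22.0};

    for (double hz : {60.0, 120.0}) {
        const double interval = 1000.0 / hz;  // 16.7ms at 60Hz, 8.3ms at 120Hz
        double total = 0.0;
        for (double t : renderTimesMs) {
            // VSync rounds each frame time UP to a whole number of refresh
            // intervals: 16.7/33.3/50ms at 60Hz; 16.7/25/33.3ms at 120Hz.
            total += std::ceil(t / interval) * interval;
        }
        const double avgMs = total / renderTimesMs.size();
        std::printf("%.0fHz: average frame time %.1fms -> %.1f FPS\n",
                    hz, avgMs, 1000.0 / avgMs);
    }
    return 0;
}
```

With these numbers, the 60Hz run averages out to roughly 36 FPS, while the 120Hz run lands around 45 FPS because most frames can use the extra 25ms step.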
TL;DR: I use RadeonPro's "Lock framerate to refresh rate" option to eliminate input lag with VSync+Triple Buffering. Perfect 60FPS with almost no input lag feels amazing.
I think this would be worth a shot, at least until adaptive sync monitors become mainstream.
What I suspect external framerate limiting does in some games is delay the point in time where they sample input until just before they start rendering the frame (resulting in frames always having the impact of the most recent input sampling built-in).
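In loop form, the suspected behavior looks something like this - a conceptual sketch only, where sampleInput() and simulateAndRender() are hypothetical stand-ins for the game's real work, and no claim is made about any specific limiter's actual implementation:

```cpp
// Conceptual sketch: the limiter blocks FIRST, then the game samples input,
// so every frame ships with the freshest possible input built in.
#include <chrono>
#include <thread>

using Clock = std::chrono::steady_clock;

void sampleInput() { /* poll mouse/keyboard/pad here */ }
void simulateAndRender() { /* game logic + draw calls here */ }

void runCappedLoop(double capHz) {
    const auto budget = std::chrono::duration_cast<Clock::duration>(
        std::chrono::duration<double>(1.0 / capHz));
    auto nextFrame = Clock::now() + budget;
    for (;;) {
        std::this_thread::sleep_until(nextFrame);  // the limiter's wait happens here...
        nextFrame += budget;
        sampleInput();        // ...so input is sampled right before the frame starts
        simulateAndRender();  // the frame is built with the most recent input baked in
    }
}
```

Compare that with a naive loop that samples input first and then stalls waiting for the flip: there the sampled input is already a frame old by the time it reaches the screen.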
What's the perception of Nvidia's adaptive vsync option?
Adaptive V-sync does pretty much exactly what it promises: V-sync above [Hz] FPS, tearing below that. However, it doesn't address the pacing and buffering issues sometimes solved by external frame limiting.
If you're using "No Vsync", capping your frame rate isn't usually worth it unless the increased frame rate is stressing your CPU too much and causing inconsistency. If you do cap it, there are certain values to aim for and avoid. Too close to the refresh rate (or its 2x, 3x, etc. harmonic frequencies) and it will tear more violently. If you can find the right value (which I don't remember) you can move the tear line very close to off-screen.
Some more questions: Do external applications (D3DOverrider, RadeonPro, etc.) implement triple buffering "correctly"?
No, they use the "common" version.
Yup. I found this out when fucking around with Mercenary Kings. The game stutters like crazy with Aero on, and tears like crazy with Aero off. Really fucking stupid.
So if I'm reading Arulan's post correctly, is it generally best to turn off vsync in-game and let the Nvidia drivers do the work, plus the FPS cap with RivaTuner Statistics Server?
Yes, that's mostly it. You can also use D3DOverrider for DB Vsync for slightly lower input latency, as a trade-off against slightly less consistent motion.
Also, a quick question here: under vsync the Nvidia drivers give me an "On" option, as well as "On (smooth)". What's the difference? Is it SLI related?
So, which tools ARE doing triple buffering correctly? Do RadeonPro and D3DOverrider do it? I definitely feel as though both those tools have introduced input lag when I turned on triple buffering, but I can't figure out if that's my imagination, the game, or an incorrect setting.
Enable RadeonPro's "Lock framerate up to monitor's refresh rate" setting, enable VSync, and enjoy.
Why only a half-refresh advantage? With, say, 600 FPS triple-buffered at 60 Hz you should at most get ~3ms of lag, which is much less than 16.6/2.
Because if by "input lag" we mean the duration between user action and a displayed frame responding to that user action, the refresh itself is a limiting factor for vsync'd cases. For instance, an input made the instant after a refresh occurs will always take at least one full refresh before being displayed in a vsync'd case, no matter how fast the GPU is spitting out frames.
Am I reading this chart correctly?
Yes.
So, which tools ARE doing triple buffering correctly?
None that I know of. It's basically impossible to force correct triple buffering for DirectX9 titles in full-screen mode externally, otherwise I would have put such functionality in GeDoSaTo already.
What I found misleading about your earlier post is that you put it as "ultimately approaching a half-refresh advantage", which I read as getting a half-refresh advantage at most. This is obviously not true, as the advantage can easily be a full refresh depending on the scenario.
Taking that factor into account, my post was wrong about the exact discrepancy, though. "Half a refresh" is the average input lag if you're triple-buffered with infinitely fast frame processing, with the distribution evenly spread from 0 refreshes to 1 refresh, whereas the double-buffered case would have 1.5 refreshes as the average, ranging from 1 refresh to 2 refreshes. So the discrepancy in typical input lag would actually be 1 refresh in favour of triple buffering in cases where the GPU is running at cartoonishly high framerates.
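For anyone who wants to sanity-check that arithmetic, here's a tiny Monte Carlo sketch of the idealized model described above - infinitely fast GPU, vsync'd 60Hz display, input events landing at uniformly random points within a refresh interval. The numbers are only as good as those assumptions:

```cpp
// Monte Carlo estimate of average input lag (in refresh intervals) for the
// idealized triple-buffered vs. double-buffered cases discussed above.
#include <cstdio>
#include <random>

int main() {
    std::mt19937 rng(42);
    // Where within a refresh interval the input lands: 0 = just after a
    // vblank, approaching 1 = just before the next vblank.
    std::uniform_real_distribution<double> phase(0.0, 1.0);
    const int N = 1000000;
    double tripleSum = 0.0, doubleSum = 0.0;
    for (int i = 0; i < N; ++i) {
        const double t = phase(rng);
        // Triple-buffered, infinitely fast GPU: a frame containing the input
        // is ready almost instantly and shown at the very next vblank.
        tripleSum += 1.0 - t;
        // Double-buffered at refresh-rate FPS: the already-queued frame
        // predates the input, so the input-bearing frame flips one vblank later.
        doubleSum += (1.0 - t) + 1.0;
    }
    std::printf("triple-buffered average: %.2f refreshes (~%.1f ms at 60Hz)\n",
                tripleSum / N, tripleSum / N * 16.7);
    std::printf("double-buffered average: %.2f refreshes (~%.1f ms at 60Hz)\n",
                doubleSum / N, doubleSum / N * 16.7);
    return 0;
}
```

It prints ~0.50 and ~1.50 refreshes, matching the "evenly spread from 0 to 1" and "1 to 2" ranges above.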