
PC VSync and Input Lag Discussion: Magic or Placebo?

Very good info in this thread! Something everybody should read when they get into PC gaming.

But this does beg the question: why can't devs implement this correctly themselves? It's 2015... Having a choice in fiddling with settings is nice, but needing all these external tools to get an acceptable result shouldn't be necessary, and it turns less tech-savvy people off.
 

astraycat

Member
None that I know of. It's basically impossible to force correct triple buffering for DirectX9 titles in full-screen mode externally, otherwise I would have put such functionality in GeDoSaTo already.

The only way to make correct triple buffering happen reliably is borderless windowed mode with desktop composition.

What I found misleading about your earlier post is that you put it as "ultimately approaching a half-refresh advantage", which I read as getting a half refresh advantage at most. This is obviously not true, as the advantage can easily be a full refresh depending on the scenario.

You lay it out much better in your new post.

If only proper triple buffering was actually easy to do. I think my DX11.2 implementation is correct, but I've yet to rigorously test it....

And that's without proper prediction code! Triple buffering requires reliable prediction of GPU load, otherwise moving objects will jitter if you predict a frame ahead/behind when it will actually be flipped. It's not easy!

[I assume we're VSYNC'd]
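
To make that concrete, here's a minimal, hypothetical sketch of the kind of setup the DXGI 1.3 / D3D11.2-era APIs allow: three buffers, flip-model presentation, and a frame-latency waitable object so the game never queues frames ahead. It isn't anyone's actual implementation, it sidesteps the prediction problem entirely, and every name in it is made up.
Code:
// Minimal sketch (not from GeDoSaTo): a 3-buffer flip-model swap chain with a
// frame-latency waitable object, available with DXGI 1.3 on Windows 8.1.
// Assumes an ID3D11Device* `device` and an HWND `hwnd` already exist.
#include <windows.h>
#include <d3d11.h>
#include <dxgi1_3.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

ComPtr<IDXGISwapChain2> MakeTripleBufferedSwapChain(ID3D11Device* device, HWND hwnd)
{
    ComPtr<IDXGIFactory2> factory;
    CreateDXGIFactory1(IID_PPV_ARGS(&factory));

    DXGI_SWAP_CHAIN_DESC1 desc = {};
    desc.Format      = DXGI_FORMAT_R8G8B8A8_UNORM;
    desc.SampleDesc  = {1, 0};
    desc.BufferUsage = DXGI_USAGE_RENDER_TARGET_OUTPUT;
    desc.BufferCount = 3;                                // three buffers to flip between
    desc.SwapEffect  = DXGI_SWAP_EFFECT_FLIP_SEQUENTIAL; // flip model (composed in windowed mode)
    desc.Flags       = DXGI_SWAP_CHAIN_FLAG_FRAME_LATENCY_WAITABLE_OBJECT;

    ComPtr<IDXGISwapChain1> sc1;
    factory->CreateSwapChainForHwnd(device, hwnd, &desc, nullptr, nullptr, &sc1);

    ComPtr<IDXGISwapChain2> sc2;
    sc1.As(&sc2);
    sc2->SetMaximumFrameLatency(1);   // don't let Present() buffer extra frames ahead
    return sc2;
}

// Per frame: wait until the swap chain can accept a new frame, *then* sample
// input and render, so what ends up on screen is as fresh as possible.
//
//   HANDLE waitable = swapChain->GetFrameLatencyWaitableObject();
//   WaitForSingleObjectEx(waitable, 1000, TRUE);
//   // ...read input, render...
//   swapChain->Present(1, 0);   // vsync'd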
 

Arkanius

Member
None that I know of. It's basically impossible to force correct triple buffering for DirectX9 titles in full-screen mode externally, otherwise I would have put such functionality in GeDoSaTo already.

The only way to make correct triple buffering happen reliably is borderless windowed mode with desktop composition.

What I found misleading about your earlier post is that you put it as "ultimately approaching a half-refresh advantage", which I read as getting a half refresh advantage at most. This is obviously not true, as the advantage can easily be a full refresh depending on the scenario.

You lay it out much better in your new post.

So the best and most reliable way to have correct Vsync is borderless fullscreen and frame limiting to 60fps or whatever your refresh rate is?
 

daninthemix

Member
Do g-sync monitors play nicely with RTSS framerate caps? For instance, could I cap at 40fps and get a perfectly smooth 40fps with g-sync?
 

Durante

Member
So the best and most reliable way to have correct Vsync is borderless fullscreen and frame limiting to 60fps or whatever your refresh rate is?
That's the best way to get correct triple buffering while minimizing input lag.

Except if you have a game where you can maintain 2x (or more) your refresh rate in FPS, then you should set the frame limit to that.
The only games where I've ever felt this to be relevant are the Ys games, which have extremely immediate controls and low performance requirements. E.g. in Ys Origin I can maintain 240 FPS easily, and that feels a lot more responsive than a "perfect" 60 FPS cap.
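
For reference, this is roughly the idea behind a frame limiter; external tools like RTSS apply the same thing from outside the game. A hypothetical sketch, with all names invented:
Code:
// Hypothetical sketch of a frame limiter: pick a target frame time, sleep away
// most of the slack, then spin the last bit for precision. None of these names
// come from any real tool.
#include <chrono>
#include <thread>

class FrameLimiter {
    using clock = std::chrono::steady_clock;
public:
    explicit FrameLimiter(double fps)
        : frameTime_(std::chrono::duration_cast<clock::duration>(
              std::chrono::duration<double>(1.0 / fps))),
          next_(clock::now()) {}

    void wait() {                               // call once per frame, after presenting
        next_ += frameTime_;
        auto margin = std::chrono::milliseconds(2);
        if (clock::now() < next_ - margin)
            std::this_thread::sleep_until(next_ - margin);  // coarse, scheduler-friendly sleep
        while (clock::now() < next_) { /* spin for the last ~2 ms */ }
    }

private:
    clock::duration   frameTime_;
    clock::time_point next_;
};

// Usage (cap at the refresh rate, or at 2x+ if the game can sustain it):
//   FrameLimiter cap(60.0);
//   while (running) { readInput(); update(); render(); present(); cap.wait(); }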

Do g-sync monitors play nicely with RTSS framerate caps? For instance, could I cap at 40fps and get a perfectly smooth 40fps with g-sync?
When I tested my G-sync monitor it seemed that way, yes. You also get graceful degradation when you don't hit the frame cap as a bonus.
 

daninthemix

Member
When I tested my G-sync monitor it seemed that way, yes. You also get graceful degradation when you don't hit the frame cap as a bonus.

Then that really is a game-changer. Set the cap to whatever your system can handle. I still won't jump in until the screens are cheaper and include VA models, though.
 

Arulan

Member
When I tested my G-sync monitor it seemed that way, yes. You also get graceful degradation when you don't hit the frame cap as a bonus.

Which is perfect for G-Sync, because from what I've read on the subject (I still haven't had the pleasure of using it in person), if the rendering frame rate exceeds the G-Sync refresh cap, it adds some input latency (I believe a frame?).

For cases where you do constantly exceed or match your refresh rate, wouldn't double-buffered Vsync + RTSS cap at refresh rate ultimately be preferable because you can eliminate judder? Unless the difference in input latency is valued more than removing judder I suppose.
 

riflen

Member
Then that really is a game-changer. Set the cap to whatever your system can handle. I still won't jump in until the screens are cheaper and include VA models, though.

Just so you're aware, you might be waiting a long time for prices to fall. Panels that can manage varying refresh rates well enough without artifacting are going to be more expensive to produce and certify than panels that just need to work at 60 Hz alone.

The more capable the panel, the more manufacturers will charge. Variable refresh will be a feature that attracts a price premium for some time.
 

Durante

Member
Just so you're aware, you might be waiting a long time for prices to fall. Panels that can manage varying refresh rates well enough without artifacting are going to be more expensive to produce and certify than panels that just need to work at 60 Hz alone.

The more capable the panel, the more manufacturers will charge. Variable refresh will be a feature that attracts a price premium for some time.
I'm totally ready to pay whatever they want for a high-res, VA, variable refresh panel.

I'd even switch to AMD if it's only adaptive sync.

For cases where you do constantly exceed or match your refresh rate, wouldn't double-buffered Vsync + RTSS cap at refresh rate ultimately be preferable because you can eliminate judder? Unless the difference in input latency is valued more than removing judder I suppose.
With a perfect multiple of your refresh rate as the framerate you wouldn't get any judder. What you might get in games with motion blur is reduced motion blur, but in most games with motion blur I can't do locked 120 FPS anyway :p
 
I'm totally ready to pay whatever they want for a high-res, VA, variable refresh panel.
Hmm... does anyone besides Eizo even make a (non-TV/overclocked) high-refresh VA panel at this point?
So I'm guessing your best bet is whenever a new Eizo Foris comes out.
Currently it's "choose any two", unless you'd be fine with IPS (yea right).

With a perfect multiple of your refresh rate as the framerate you wouldn't get any judder. What you might get in games with motion blur is reduced motion blur, but in most games with motion blur I can't do locked 120 FPS anyway :p
In my opinion there isn't such a thing as good motion blur without gaze tracking, except for slooow camera pans and/or where you can roughly guess it (e.g. non-"shaky" movies).
 

daninthemix

Member
I'm totally ready to pay whatever they want for a high-res, VA, variable refresh panel.

I doubt I'll ever see my ideal one - I would want 27" 1080p VA G-sync. I don't want higher res because of the much higher GPU costs, and I like a big screen.

It's never going to happen :(
 

realzzz

Neo Member
Good thread!

Does anyone have the same problem in RadeonPro? There's a notification sound that I hear in-game even though I disabled notifications in RadeonPro's settings. Thanks in advance.
 

Durante

Member
Hmm... does anyone besides Eizo even make a (non-TV/overclocked) high-refresh VA panel at this point?
I don't really see why it needs to be non-TV. Give me one of those 4k 32" TV VA panels with driver logic that actually supports 120 Hz input and doesn't take forever to process a frame and I'll be perfectly content.
 
I don't really see why it needs to be non-TV. Give me one of those 4k 32" TV VA panels with driver logic that actually supports 120 Hz input and doesn't take forever to process a frame and I'll be perfectly content.
The big issue currently is that neither HDMI nor DisplayPort 1.2 has enough bandwidth.
The Seiki SE39UY04, for example, can accept 120 Hz or 4K, but not both.

So it's not looking good unless, e.g., the upcoming new Seikis ship with DisplayPort 1.3 and less lag.

(In case you're not already aware of that)
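
The arithmetic behind that is easy to sanity-check. A small sketch (the link figures are approximate and blanking overhead is ignored):
Code:
// Back-of-the-envelope bandwidth check. Link payload figures are approximate:
// DP 1.2 (HBR2) ~17.28 Gbit/s after 8b/10b coding, HDMI 1.4 ~8.16 Gbit/s.
#include <cstdio>

int main() {
    auto need = [](double w, double h, double hz, double bpp) {
        return w * h * hz * bpp / 1e9;    // Gbit/s of raw pixel data, no blanking
    };
    std::printf("4K @  60 Hz needs %5.1f Gbit/s\n", need(3840, 2160,  60, 24));
    std::printf("4K @ 120 Hz needs %5.1f Gbit/s\n", need(3840, 2160, 120, 24));
    std::printf("DP 1.2 payload ~17.28, HDMI 1.4 ~8.16 Gbit/s\n");
    // ~11.9 Gbit/s for 4K60 fits DP 1.2; ~23.9 Gbit/s for 4K120 does not,
    // even before blanking -- hence "120 Hz or 4K, but not both".
}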
 

Durante

Member
I am aware. I also don't want a TV (because of input lag etc.).

I want someone to use one of those panels to build a monitor, with DP1.3.

Seiki is at least creating the Seiki Pro monitor line now, but not with >60 Hz.
 

Kezen

Banned
Having a choice in fiddling with settings is nice, but needing all these external tools to get an acceptable result shouldn't be necessary, and it turns less tech-savvy people off.

There are other platforms to enjoy for less tech-savvy people. Leave PC gaming for those who can appreciate the wide range of options it offers.
 

Durante

Member
More importantly, I'd argue that none of this is necessary to get acceptable results. Most of what we are talking about here is solving concerns that many would classify as extremely minor or even unnoticeable.
 

Stitch

Gold Member
So what is the vsync in Dying Light doing? It allows framerates between 30 and 60, but it drops frames like a motherfucker. One scene I look at with vsync off, the framerate is 50.
Same scene with in-game vsync on, the framerate is 35.

So this shouldn't be double buffering, since the framerate isn't 30, but the drop is severe as fuck.
And borderless fullscreen in Ryse or Watch Dogs etc., for example, did not cause a drop like this at all. But in DL, windowed mode incurs the same performance drop as vsync. What's the deal here?

Essentially, I can play DL locked to 30, which sucks, or with vsync off, which is much better, but the tearing is annoying.

The vsync in Dying Light is buggy. That's one of the first things I noticed (the other things were all the ugly post-processing effects). You either get 30 FPS or 60 FPS.

You can fix this by alt-tabbing out of the game and then alt-tabbing back into the game.
 

Kezen

Banned
The vsync in Dying Light is buggy. That's one of the first things I noticed (the other things were all the ugly post-processing effects). You either get 30 FPS or 60 FPS.

You can fix this by alt-tabbing out of the game and then alt-tabbing back into the game.

That's not "buggy". That's double buffered vsync.
 
For those of you who have a G-Sync monitor, as someone who is interested in getting one, do you find the experience to be something immediately noticeable or impressive? I'd be pairing it with a 980.
 
None that I know of. It's basically impossible to force correct triple buffering for DirectX9 titles in full-screen mode externally, otherwise I would have put such functionality in GeDoSaTo already.

In that case, because I'm curious, do you know what RadeonPro and D3DOverrider are actually doing? Why do they seem to introduce input lag in some games but not others?

Also, just in general, it would be cool if someone could put together a list of games that do triple buffering properly. If nothing else, people will be able to see what it's like, and maybe more gamers will start calling for it.
 

jorimt

Member
@Arulan post #45:

Finally, what I've learned through experimentation (and lots of research) being confirmed by an actual person, you know, other than myself.

Good to know I'm not entirely alone in understanding this stuff, and also good to learn a few new things from your post.

I've been wasting far too much time in the Steam forums (*shudders*).
 

shockdude

Member
I've updated the OP a bit. Any thoughts? Any inaccuracies?

In that case, because I'm curious, do you know what RadeonPro and D3DOverrider are actually doing? Why do they seem to introduce input lag in some games but not others?
The input lag from injected triple-buffering might be related to this, but I could be wrong.
I put together models for this stuff a while ago (as an Excel VBA program, lol); results looked like so:

[Image: modeled input lag results]


Note that part of why the results can be so low is that it doesn't account for the time taken to output and display the frame, which for a given setup will tend to be a roughly constant addition.
I hope to try a blind test this weekend to verify this model.

Also, just in general, it would be cool if someone could put together a list of games that do triple buffering properly. If nothing else, people will be able to see what it's like, and maybe more gamers will start calling for it.
Yeah I don't think any game does triple-buffering "properly," unfortunately. Just run a game windowed and use an external frame limiter, or find a game with a built-in 60FPS limit.
Imo RadeonPro's "lock framerate to refresh rate" + VSync has very similar input lag to windowed @ 60FPS. This is based on some quick informal tests in Touhou 11, which has an in-engine 60FPS frame limiter. At that point it's a matter of whether you value windowed (alt-tabbing) or fullscreen (better performance, better frame timing).
Both are slightly inferior to fullscreen no-vsync but that's expected.
 

jorimt

Member
I am aware. I also don't want a TV (because of input lag etc.)

Ha, try playing on a 50" Panasonic ST60 plasma TV. The minimum input lag on this thing is 75ms, and that's with "Game Mode" on.

I do have a decent 24" monitor with a much more respectable 20ms of input lag, but the PQ on the Panasonic is so far beyond most edge-lit (worst back-light tech idea ever) LED displays, it's hard to play on anything else. Its size advantage over a monitor and the ability to play on a couch (yes, mouse and all) don't hurt either.

Is it optimal? Nope, but it's most certainly not "unplayable" either. And that's the great thing about PC gaming: choices.

More importantly, I'd argue that none of this is necessary to get acceptable results. Most of what we are talking about here is solving concerns that many would classify as extremely minor or even unnoticeable.

I agree, the difference, at worst, is a short bat of an eyelash vs. a long one.

That, and some seem to forget that input lag isn't just from a display, or from vsync or so on. It's layered on from multiple sources by the time it reaches the screen.

Also, what isn't often discussed is reaction time, which is going to vary a bit by person to person.

Let's say we get the input from our mouse down to the bare minimum of 1ms, we have a 1ms display, and we're running a game at such a high refresh rate/framerate that we're getting far below 16.6ms render times. Seeing as the average human reaction time is 200ms, no matter how much we decrease the input lag, the average person is still going to (maybe not perceive, but) react at 200ms, and we can't prevent that. So, from where I see it, any game that has a total input latency below the average reaction time is entirely "playable."

When I do play the same game on a display with lower input lag than my TV, I don't find that I'm instantly doing better with the extra reaction time afforded to me; in fact, I find that I'm reacting too early, because that is what my brain learned to do with that extra delay.

Our brains do an amazing job of adapting, and I think that's more important than the slight (sometimes thought of as "HUGE") differences in input latency.

That said, I'm not suggesting we don't aim for lower latency, of course we should. Just trying to add my (admittedly unusual) perspective on it.
 

Durante

Member
The thing is, input lag will always be added on top of any human reaction time. So you never quite reach the point where it's irrelevant.

Personally, for twitch reflex based games, I play on a 120 Hz DLP projector with generally <=1 frame of processing lag (and ~0 switching time since it's DLP). It's just 720p, but downsampling from 2560x1440 the IQ is quite acceptable, and there's just no 1080p alternative that can match it latency-wise :/
 

Arulan

Member
With a perfect multiple of your refresh rate as the framerate you wouldn't get any judder. What you might get in games with motion blur is reduced motion blur, but in most games with motion blur I can't do locked 120 FPS anyway :p

Hmm, I was under the impression that even if you maintained a consistent multiple you would still get judder with "real Triple-Buffered Vsync" similar to how even if you can maintain >60 fps consistently with "No Vsync" and cap it at 60, you still get judder (both aren't waiting for synchronization to swap to front buffer).

As for motion blur, I find it quite terrible regardless. :p

@Arulan post #45:

Finally, what I've learned through experimentation (and lots of research) being confirmed by an actual person, you know, other than myself.

Good to know I'm not entirely alone in understanding this stuff, and also good to learn a few new things from your post.

I've been wasting far too much time in the Steam forums (*shudders*).
That's really good info, thanks a lot!
Well thanks for this info, I did not know this.
Really wish I could get Dying Light working as I did Ryse etc :(

Glad I could help. I've spent some time trying to perfect my experience, and unfortunately there is very little information, much less organized on this subject. You can imagine my reaction when they announced G-sync. :)
 

riflen

Member
For those of you who have a G-Sync monitor, as someone who is interested in getting one, do you find the experience to be something immediately noticeable or impressive? I'd be pairing it with a 980.

The G-Sync monitors have two modes, meant for different situations. G-Sync mode makes the display slave to the GPU and only refreshes the display panel when the GPU has a new frame to output. This works between 30 and 143fps (or 30-59fps for some 4K G-Sync displays).

The effect of this mode is immediately noticeable to me and has many advantages over Vsync.
1. I do not have to worry about configuring the game to find the best vertical sync settings, or fiddling with 3rd-party tools. Just make sure the Nvidia control panel has G-Sync mode set for the game, then play.
2. I don't have to configure the game so that it always maintains a certain frame rate. Arbitrary frame rates are possible for all games and feel good to play at. There's a very consistent feel.
3. Most people aren't aware that VSync can cause stuttering in a game, even if you're generating enough frames to maintain display synchronisation. These kinds of stutters are usually incorrectly attributed to some issue with the game or other performance problem. This stuttering is gone in G-Sync mode.

G-Sync mode isn't alchemy though. Frame rates in the 30s don't magically feel like 60fps and if you're like me, you'll see "ghosting" when there is high motion at the lower frame/refresh rates. This is understandable, as the panel is updating rather slowly in this scenario. Still, when the frame rate dips, I find it easier to "play through it" and not get pulled out of the action. G-Sync mode only works in full-screen mode. Personally, I now consider variable-refresh a must-have for a PC games monitor.

The second mode is Ultra Low Motion Blur. Most people will use this less often; it's designed to give excellent motion fidelity. To use this mode, you should enable Vsync, and best results come if you maintain a fixed frame rate. I believe it works at 85, 100 or 120 Hz.
I would say that while G-Sync mode has more utility overall, ULMB is much more impressive to see. Part of this can be attributed to the very high frame rates you'll have to use, but the clarity is startling.
You can make out texture detail during high motion that would usually be hidden in a smudge of motion blur as the panel updates. It has a hyper-real feel to it and actually helps you judge movements more accurately in a game. There is considerably more information for your brain to take in with high frame rates and ULMB. This mode is useful if you play games that are trivial to run or you play competitive twitch games at high frame rates. It's not possible to enable both G-Sync and ULMB modes simultaneously.
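
To make the first point concrete, here's a toy model of how a variable-refresh panel tracks frame delivery where fixed 60 Hz vsync rounds up to the next vblank. Everything in it (the frame times, the 144 Hz panel floor) is an illustrative assumption, not a measurement:
Code:
// Toy model of why variable refresh feels smoother: the panel refreshes when
// the frame is ready (within its supported range), while fixed 60 Hz vsync
// has to wait for the next vblank. Frame times below are made up.
#include <cmath>
#include <cstdio>
#include <initializer_list>

int main() {
    const double vblank = 1000.0 / 60.0;      // fixed-refresh interval, ms
    const double panelMin = 1000.0 / 144.0;   // fastest the panel can refresh
    for (double frame : {7.0, 12.0, 18.0, 25.0, 33.0}) {
        double gsync = frame < panelMin ? panelMin : frame;   // shown when ready
        double vsync = std::ceil(frame / vblank) * vblank;    // shown at next vblank
        std::printf("frame %4.1f ms -> variable refresh %5.1f ms, 60 Hz vsync %5.1f ms\n",
                    frame, gsync, vsync);
    }
}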
 

jorimt

Member
As for motion blur, I find it quite terrible regardless. :p

At 30 fps, it can help blend frames and diminish the perceived ghosting a bit, which, with most AAA games being ported from the 30 fps console versions, is probably why it's been in use so often lately.

And at 60+ fps, depending on its implementation, motion blur (especially per-object) can act as sort of a temporal stabilizer in concert with in-game AA to reduce sub-pixel shimmering in motion.

But I agree, motion blur for motion blur's sake at 60 frames or above is basically useless.
 
The G-Sync monitors have two modes, meant for different situations. G-Sync mode makes the display slave to the GPU and only refreshes the display panel when the GPU has a new frame to output. This works between 30 and 143fps (or 30-59fps for some 4K G-Sync displays).

The effect of this mode is immediately noticeable to me and has many advantages over Vsync.
1. I do not have to worry about configuring the game to find the best vertical sync settings, or fiddling with 3rd-party tools. Just make sure the Nvidia control panel has G-Sync mode set for the game, then play.
2. I don't have to configure the game so that it always maintains a certain frame rate. Arbitrary frame rates are possible for all games and feel good to play at. There's a very consistent feel.
3. Most people aren't aware that VSync can cause stuttering in a game, even if you're generating enough frames to maintain display synchronisation. These kinds of stutters are usually incorrectly attributed to some issue with the game or other performance problem. This stuttering is gone in G-Sync mode.

G-Sync mode isn't alchemy though. Frame rates in the 30s don't magically feel like 60fps and if you're like me, you'll see "ghosting" when there is high motion at the lower frame/refresh rates. This is understandable, as the panel is updating rather slowly in this scenario. Still, when the frame rate dips, I find it easier to "play through it" and not get pulled out of the action. G-Sync mode only works in full-screen mode. Personally, I now consider variable-refresh a must-have for a PC games monitor.

The second mode is Ultra Low Motion Blur. Most people will use this less often; it's designed to give excellent motion fidelity. To use this mode, you should enable Vsync, and best results come if you maintain a fixed frame rate. I believe it works at 85, 100 or 120 Hz.
I would say that while G-Sync mode has more utility overall, ULMB is much more impressive to see. Part of this can be attributed to the very high frame rates you'll have to use, but the clarity is startling.
You can make out texture detail during high motion that would usually be hidden in a smudge of motion blur as the panel updates. It has a hyper-real feel to it and actually helps you judge movements more accurately in a game. There is considerably more information for your brain to take in with high frame rates and ULMB. This mode is useful if you play games that are trivial to run or you play competitive twitch games at high frame rates. It's not possible to enable both G-Sync and ULMB modes simultaneously.

Thanks for getting back to me. I appreciate the rundown.
 

shockdude

Member
I came up with a cheap cmd script to run two programs in a random order, and then print which program was run first. This can probably be used to do blind tests to compare two configurations for the same game, e.g. double-buffering vs triple-buffering.
It can probably be modified to write to a file so that you don't immediately know which was which.
Edit: Updated script in post 141.
 

Deleted member 245925

Unconfirmed Member
So the best and most reliable way to have correct Vsync is borderless fullscreen and frame limiting to 60fps or whatever your refresh rate is?

That's the best way to get correct triple buffering while minimizing input lag.

Is there an easy solution to both force borderless fullscreen (as many games unfortunately still don't have the option in the settings) and limit the frame rate as mentioned above? From what I read in this and the other thread, there are several tools for both. What's the most hassle-free and generally applicable way to achieve this?
 

jorimt

Member
Is there an easy solution to both force borderless fullscreen (as many games unfortunately still don't have the option in the settings) and limit the frame rate as mentioned above? From what I read in this and the other thread, there are several tools for both. What's the most hassle-free and generally applicable way to achieve this?

This program works well for games that can be put into windowed mode: http://westechsolutions.net/sites/WindowedBorderlessGaming/

With that set, you can simply install MSI Afterburner and use the included RivaTuner Statistics Server to set a frame limit for the game's exe: http://event.msi.com/vga/afterburner/download.htm
 

PnCIa

Member
Ever since I discovered vsync and framerate limiting I can't live without it. It's funny that the built-in limiter in the Nvidia driver essentially limits your FPS to 59.something in D3D games, and not to 58 when you set it to 58.
Using RTSS you can get 58, but I always get the feeling that it judders a bit.

/edit: Limiting your FPS to something lower than your refresh rate (like 60 Hz -> 58 FPS) always creates judder, right? Because you get duplicated frames...
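
A quick way to check the duplicated-frames intuition, assuming a perfectly steady 58 fps on a 60 Hz display (a hypothetical model, not a measurement):
Code:
// Duplicated-frame check for a steady 58 fps cap on a 60 Hz display: frames
// arrive every 1/58 s, refreshes happen every 1/60 s, so a couple of refreshes
// per second find no new frame ready and repeat the previous one.
#include <cstdio>

int main() {
    const double refresh = 1.0 / 60.0, frame = 1.0 / 58.0;
    double nextFrame = frame;
    int fresh = 0;
    for (int r = 1; r <= 120; ++r) {                      // two seconds of refreshes
        double t = r * refresh;
        if (nextFrame <= t) { ++fresh; nextFrame += frame; }
        else std::printf("refresh %3d shows a repeated frame\n", r);
    }
    std::printf("%d fresh frames over 120 refreshes\n", fresh);  // expect ~116
}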
 

Arkanius

Member
That's the best way to get correct triple buffering while minimizing input lag.

Except if you have a game where you can maintain 2x (or more) your refresh rate in FPS, then you should set the frame limit to that.
The only games where I've ever felt this to be relevant are the Ys games, which have extremely immediate controls and low performance requirements. E.g. in Ys Origin I can maintain 240 FPS easily, and that feels a lot more responsive than a "perfect" 60 FPS cap.

When I tested my G-sync monitor it seemed that way, yes. You also get graceful degradation when you don't hit the frame cap as a bonus.

I can sustain a reliable 120 in Dota 2, for example, and I had it capped at 60fps.
I'll change the FPS max to 120 now then. (I was doing it to conserve power.)
 

shockdude

Member
Finished my cheap program comparison batch script.
First test I did with it was checking whether I could tell the input lag difference between VSync and NoVSync @ 60FPS, and I guessed correctly 29/30 times. Good enough.
Gonna have some fun times comparing other things.
Code:
@echo off
setlocal EnableDelayedExpansion

:: edit these to be the two programs you want to compare
set program1=SuperMeatBoy_DB.exe
set program2=SuperMeatBoy_TB.exe

set /p numtests=How many tests? 

for /l %%g in (1, 1, !numtests!) do (
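	rem pick the run order at random; the on-screen labels stay the same either way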
	set /a test = !RANDOM! %% 2 + 1
	set myanswer=-1

	if !test! equ 1 (
		echo Running Program 1
		!program1!
		echo Running Program 2
		!program2!
	) else (
		echo Running Program 1
		!program2!
		echo Running Program 2
		!program1!
	)

	set /p myanswer=Which program was faster? Enter 1 or 2: 
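	rem log the true ordering alongside the blind guess for later scoring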
	echo !test!, !myanswer! >> outdata.csv
)
 

TSM

Member
The G-Sync monitors have two modes, meant for different situations. G-Sync mode makes the display slave to the GPU and only refreshes the display panel when the GPU has a new frame to output. This works between 30 and 143fps (or 30-59fps for some 4K G-Sync displays).

This is incorrect. G-Sync works perfectly fine under 30 fps. I use it when watching NTSC (29.97) or PAL (25) video. It perfectly syncs the video to eliminate judder. G-Sync is fantastic with anything that requires really odd refresh rates, like some MAME games or NTSC video. If you want to play a MAME game that requires 43.7856 Hz, then G-Sync will present that to you.

As for the topic, G-Sync pretty much renders the topic moot as long as the game enables exclusive full screen. Unfortunately for those games that do not allow exclusive full screen you end up right back in this same boat. There's no perfect solution at the moment.
 

shockdude

Member
Did some double-blind tests to compare double-buffering vs triple-buffering with Super Meat Boy using my script. Double-buffering & triple-buffering were both enforced using RadeonPro.
One thing I discovered during these tests is that VSync's input latency varied a lot every time the game was opened. Sometimes it felt like 100ms extra input lag, sometimes it felt like no lag at all.

DB vs TB: triple buffering felt faster for 18/30 trials. Cannot conclude with 90% confidence that TB is superior. Also input latency varied a lot.
DB vs TB after changing max pre-rendered frames to 1: triple buffering felt faster for 22/30 trials. Can conclude with 90% confidence that TB is superior - except that input latency still varied a lot so that conclusion is probably moot.

So I enabled "lock framerate to refresh rate" to minimize as much extra latency as possible.

DB vs TB, MPF=1, lock framerate: triple buffering felt faster for 15/30 trials. Lol. Also the input latency variation didn't go away, though it was much less than it was without "lock framerate" enabled.

So basically for me there isn't much input latency difference between DB and TB, in Super Meat Boy at least.
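
For anyone wondering where those 90% confidence calls come from: if the two settings were truly indistinguishable, every trial would be a coin flip, so the p-value is just the upper tail of a Binomial(n, 1/2) at the observed score. A small sketch (the rough values it prints are in the comments):
Code:
// One-sided binomial test for the blind trials: under the "can't tell the
// difference" assumption every trial is a 50/50 guess, so the p-value is the
// upper tail of Binomial(n, 1/2) at the observed score.
#include <cmath>
#include <cstdio>

double tailAtLeast(int n, int k) {            // P(X >= k) with X ~ Binomial(n, 1/2)
    double sum = 0.0;
    for (int i = k; i <= n; ++i) {
        double c = 1.0;                       // C(n, i) built up term by term
        for (int j = 1; j <= i; ++j) c = c * (n - i + j) / j;
        sum += c;
    }
    return sum / std::pow(2.0, n);
}

int main() {
    std::printf("18/30 right: p = %.3f\n", tailAtLeast(30, 18)); // ~0.18  -> not significant at 90%
    std::printf("22/30 right: p = %.4f\n", tailAtLeast(30, 22)); // ~0.008 -> significant
    std::printf("15/30 right: p = %.3f\n", tailAtLeast(30, 15)); // ~0.57  -> coin-flip territory
    std::printf("29/30 right: p = %.8f\n", tailAtLeast(30, 29)); // the VSync vs NoVSync result
}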
 

Durante

Member
I'm happy someone is going at this in a somewhat rigorous fashion.

Still, what we'd really need is a way to measure this stuff reliably and objectively. Something like the Oculus latency tester.
 

XBP

Member
Just a question: would RadeonPro's vsync + lock framerate to refresh rate be equal in input lag/performance to forcing vsync in Nvidia's control panel + locking the framerate to the refresh rate using MSI Afterburner?

Plus, from what I gather so far, since RadeonPro's triple buffering option isn't 'proper', it's pretty pointless to enable it alongside vsync?
 

riflen

Member
This is incorrect. G-Sync works perfectly fine under 30 fps. I use it when watching NTSC (29.97) or PAL (25) video. It perfectly syncs the video to eliminate judder. G-Sync is fantastic with anything that requires really odd refresh rates, like some MAME games or NTSC video. If you want to play a MAME game that requires 43.7856 Hz, then G-Sync will present that to you.

As for the topic, G-Sync pretty much renders the topic moot as long as the game enables exclusive full screen. Unfortunately for those games that do not allow exclusive full screen you end up right back in this same boat. There's no perfect solution at the moment.

Interesting. Thanks for the information. This is off-topic, but I swear that when I tested G-Sync last year with frame times > 33ms, it didn't behave well at all. Frame rates beneath 30 felt awful and latency was high. This was in line with G-Sync reviews at the time, which stated the display will pad the output with repeat frames if you're generating fewer than 30 fps.

I also tested video playback again after your post, and it seems you can get good playback of 23.976 fps video in G-Sync mode. Again, I tested this last year and couldn't make it work well. I'm not sure whether the display is refreshing at 24 Hz or padding to keep the refresh rate at 30 Hz. I found that there was occasional stuttering of 23.976 fps video using G-Sync mode at 120 Hz, but at 144 Hz playback was smooth. Granted, I only had time for a short test today.
 

furfoots

Member
Is there anything besides RadeonPro that can force decent VSync in both 32- and 64-bit games on AMD? Even CCC won't force VSync in 64-bit.

A good test, I find, is something like Diablo 3, which is extremely sensitive when it comes to VSync. The only good way I found to get butter-smooth gameplay was with RadeonPro VSync on and pre-rendered frames (flip queue) at 2.
 

Arulan

Member
I'm happy someone is going at this in a somewhat rigorous fashion.

Still, what we'd really need is a way to measure this stuff reliably and objectively. Something like the Oculus latency tester.

That, and use something like FCAT to see how different settings affect motion smoothness (frame times). Fraps, like I would imagine most software-based methods, can only measure frame times at a certain point within the output chain (I forget which step exactly), so it isn't reliable when comparing motion smoothness.

For instance, I tried to use Fraps about a year ago for this, but it's difficult to say whether any of this information is valid given the above. I was attempting to compare the difference in motion smoothness that different CPU pre-rendered frames ahead settings would produce in conjunction with various Vsync settings. I'm basically just copying an old post I made; TB in this example means "common Triple-Buffered Vsync".

My specs:

i7-920@3.8Ghz
GTX 670 2GB OC'd
6GB RAM
OS & Game installed on SSD
120hz display

I'm using Left 4 Dead 2 in this example on the first part of the No Mercy level while playing solo. In-game visual settings are set at maximum, film grain is off, and I'm using Ambient Occlusion through Nvidia Inspector.

I list the settings that I use in each image. For Vsync this includes D3DOverrider DB and TB Vsync, In-game DB and TB Vsync, and No Vsync. I also compare settings of maximum pre-rendered frames (Nvidia Inspector) of default (Use the 3D application setting) and 1. When I use a frame cap (through RivaTuner Statistics Server) I specify it as rtcap120, otherwise I'm not using a frame cap.

[Images: five frame-time comparison charts for the Vsync / pre-rendered frames / frame cap combinations listed above]
 

jorimt

Member
Simple question about something I've been meaning to understand for a while...

With games like Assassin's Creed IV: Black Flag and Dying Light, which don't have a native triple-buffered solution, I've been able to use the Windows + L key, and then log back in while a game is running to apparently force triple-buffered vsync.

At this point, the game is still in true/exclusive fullscreen (not borderless windowed), so I assume whatever has been forced is what Arulan referred to in his post as "common" triple-buffered vsync, not "real" (although I'd assume Aero has something to do with its triggering).

Some have claimed they are able to achieve this same trick in certain double-buffered-only games by alt + tabbing or alt + entering out and back in. So, the question is, why/how is this happening, and if it works (and it is so easy to achieve, even on 64-bit games, which D3DOverrider can't even do), why isn't it being used as a method natively in-game?
 

Blizzard

Banned
Arulan said:
If your frame rate is consistently under your refresh rate then the "back buffers" won't fill up, which is what causes the additional input latency (that is, filling them up causes increased input latency). High-refresh displays in particular have an advantage because it's much harder for your frame rate to surpass the refresh rate. As for the potential for perfect motion, if your frame rate consistently matches your refresh rate then your output of frame times will be consistent.
I'm confused by part of this.

If you have a 120 Hz (refresh rate) monitor and your framerate is much lower than 120 Hz, that will have lower input latency than otherwise? If you have a 60 Hz monitor and your framerate is much HIGHER than 60 Hz, that means you'll have HIGHER input latency?

Both of those examples are the opposite of what I would expect.
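
For what it's worth, here's a toy model of the mechanism the quote describes, under heavy simplifications (constant render time, vsync on, a two-frame present queue; the numbers are illustrative, not measurements). With a render time shorter than the refresh interval the queue fills up and latency settles around three refreshes; with a render time longer than the refresh interval the queue stays near-empty and latency is roughly the render time plus the wait for the next vblank:
Code:
// Toy model of a vsync'd present queue: the display pops one queued frame per
// vblank, and the renderer blocks when the queue is full. Latency is measured
// from when a frame's input was sampled to when that frame is shown.
#include <cstdio>
#include <deque>
#include <initializer_list>

int main() {
    const double R = 1000.0 / 60.0;          // refresh interval, ms
    const int    Q = 2;                      // frames the present queue can hold

    for (double r : {8.0, 20.0}) {           // render time: above vs. below the refresh rate
        std::deque<double> queue;            // input-sample timestamps of queued frames
        double now = 0.0, nextVblank = R, latency = 0.0;
        for (int frame = 0; frame < 400; ++frame) {
            double inputTime = now;          // input read when the frame starts
            now += r;                        // CPU + GPU work
            while (nextVblank <= now) {      // vblanks that pass while rendering
                if (!queue.empty()) { latency = nextVblank - queue.front(); queue.pop_front(); }
                nextVblank += R;
            }
            while ((int)queue.size() >= Q) { // queue full: block until a vblank frees a slot
                latency = nextVblank - queue.front(); queue.pop_front();
                now = nextVblank; nextVblank += R;
            }
            queue.push_back(inputTime);
        }
        std::printf("render %4.1f ms, %d-deep queue -> steady-state latency ~%.1f ms\n", r, Q, latency);
    }
}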
 