
ScepticMatt
Member
(10-30-2015, 02:44 PM)
How to activate Just-in-Time VSync in Dota 2?
Experimental feature added as part of the Halloween Dota 2 update.
Enable VSync in the video options panel. Then open the console and set this variable:

Code:

r_experimental_lag_limiter 1
Valve dev post: http://dev.dota2.com/showthread.php?t=184108

When is Just-in-Time VSync beneficial?
When the frame rate is consistently higher than the monitor refresh rate.

What benefits does Just-in-Time VSync have?
Lower input latency.

Are there other games using Just-in-Time VSync?
A few, for example ezQuake.

How does Just-in-Time VSync work?
The rendering engine will attempt to schedule (i.e. predict) simulation and rendering to finish just before the start of a new screen refresh.
This is how I understand it (Correct me if I'm wrong):

(assuming input sampling happens just before simulation, i.e. no late frame scheduling)
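
A minimal sketch of such a loop (all names here are hypothetical; this assumes the engine can query the next vblank time and conservatively predict its own frame cost):

Code:

#include <atomic>
#include <chrono>
#include <cmath>
#include <thread>

using Clock = std::chrono::steady_clock;
using ms    = std::chrono::duration<double, std::milli>;

// --- Stand-ins for real engine hooks (all hypothetical) ---------------
static const Clock::time_point t0 = Clock::now();
static const ms refresh{1000.0 / 60.0};       // assume a 60 Hz display

Clock::time_point next_vblank() {             // next refresh boundary
    const double n = std::ceil((Clock::now() - t0) / refresh);
    return t0 + std::chrono::duration_cast<Clock::duration>(n * refresh);
}
ms   predict_frame_cost() { return ms{4.0}; } // pretend sim+render ~ 4 ms
void sample_input() {}
void simulate()     {}
void render()       {}
void present()      {}                        // swap buffers, vsync on

void jit_vsync_loop(const std::atomic<bool>& running) {
    const ms safety_margin{1.0};              // headroom for mispredictions
    while (running) {
        // Start the frame as late as possible, not right after present():
        std::this_thread::sleep_until(next_vblank()
            - std::chrono::duration_cast<Clock::duration>(
                  predict_frame_cost() + safety_margin));
        sample_input(); // sampled just before the result can be displayed
        simulate();
        render();
        present();      // ideally completes right before the vblank
    }
}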
Finaika
Banned
(10-30-2015, 02:45 PM)
Is this like adaptive vsync?
ScepticMatt
Member
(10-30-2015, 02:48 PM)

Originally Posted by Finaika

Is this like adaptive vsync?

No. Adaptive vsync just disables vsync if the frame rate falls below the refresh rate, keeping input latency low but causing tearing.
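
For reference, adaptive vsync is typically a driver feature rather than engine logic; on OpenGL it's requested with a negative swap interval via the swap-control-tear extensions (fragment only, assuming the extensions are present and the function pointers already loaded):

Code:

// Negative interval = sync to vblank when the frame arrives in time,
// tear instead of stalling when it's late. Driver support varies.
#ifdef _WIN32
wglSwapIntervalEXT(-1);                    // WGL_EXT_swap_control_tear
#else
glXSwapIntervalEXT(display, drawable, -1); // GLX_EXT_swap_control_tear
#endif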
OldAsUrSock
Member
(10-30-2015, 02:50 PM)
They need this for CSGO too.
Durante
Come on down to Durante's drive-through PC port fixes. 15 minutes or less. Yelp: ★★★★★

Qualifications: Fixed Souls, Deadly Premonition, Lightning Returns, Umihara Kawase, Symphonia, PhD, likes mimosas.
(10-30-2015, 02:52 PM)
Interesting, I wrote about this some months ago on my blog.

Also made a (similar) picture:
Gbraga
Junior Member
(10-30-2015, 02:52 PM)
How much lower? Really not a fan of using vsync for any competitive game.
Durante
Come on down to Durante's drive-through PC port fixes. 15 minutes or less. Yelp: ★★★★★

Qualifications: Fixed Souls, Deadly Premonition, Lightning Returns, Umihara Kawase, Symphonia, PhD, likes mimosas.
(10-30-2015, 02:55 PM)

Originally Posted by Gbraga

How much lower? Really not a fan of using vsync for any competitive game.

It depends on how accurate the prediction is. If everything works perfectly, ideally it's pretty much as low as it gets (and in fact lower for parts of the screen than Vsync off, while also consuming a lot less energy).
OnionPowder
Member
(10-30-2015, 03:00 PM)

Originally Posted by Durante

Interesting, I wrote about this some months ago on my blog.

Also made a (similar) picture:

That's exactly what it reminded me of. Good to see it being put in place by a developer, if only because GeDoSaTo is limited to DX9 (if I remember correctly).
Nzyme32
Member
(10-30-2015, 03:00 PM)

Originally Posted by Durante

It depends on how accurate the prediction is. If everything works perfectly, ideally it's pretty much as low as it gets (and in fact lower for parts of the screen than Vsync off, while also consuming a lot less energy).

Is there a reason why it isn't more commonplace then?
DieH@rd
Member
(10-30-2015, 03:09 PM)

Originally Posted by Finaika

Is this like adaptive vsync?

No.

This is more like: if we know how long the frame rendering will take, let's schedule the start of the rendering process [which includes taking the user's latest input] as late as possible, so that when the frame is finished we can immediately send it to the monitor.

The normal way is: as soon as the last frame is sent, render the next one and then wait until it is time to send it to the display. This increases latency because input polling and displaying the frame are stretched over a longer period of time.
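
A worked example makes the difference concrete (illustrative numbers only; assuming a 60 Hz display and a frame that takes ~4 ms to simulate and render):

Code:

// Conventional vsync: input is sampled right after the previous present,
// and the finished frame then sits waiting for the next vblank.
constexpr double refresh_ms = 1000.0 / 60.0; // ~16.7 ms per refresh
constexpr double frame_ms   = 4.0;           // time to simulate + render
constexpr double lag_vsync  = refresh_ms;    // ~16.7 ms input-to-scanout

// Just-in-time: the frame start is delayed by (refresh_ms - frame_ms),
// so input is sampled only ~4 ms (plus margin) before scan-out begins.
constexpr double lag_jit    = frame_ms;      // ~4 ms, margin excluded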
Last edited by DieH@rd; 10-30-2015 at 03:12 PM.
tuxfool
Member
(10-30-2015, 03:10 PM)

Originally Posted by Nzyme32

Is there a reason why it isn't more commonplace then?

You have to systematically predict the rendering time of each frame. I imagine this is quite hard; it would break when you get sudden rendering-time shifts, and it is only safe when you have a lot of spare power so you never drop below the capped rate.

I imagine this could be done by profiling the game and deriving some mathematical function based on what the frame is displaying. That should help prediction handle sudden rendering-time shifts, but it would all involve more work.
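
A minimal predictor sketch along those lines (hypothetical; Valve hasn't published their heuristic): smooth recent frame times, but jump straight to any spike so the budget widens immediately:

Code:

#include <algorithm>

class FrameTimePredictor {
    double estimate_ms = 1000.0 / 60.0; // start pessimistic: one full refresh
public:
    void record(double measured_ms) {
        constexpr double alpha = 0.1;   // smoothing factor
        // Rise to spikes immediately, decay slowly: one slow frame should
        // instantly widen the budget for the frames that follow it.
        estimate_ms = std::max(measured_ms,
            alpha * measured_ms + (1.0 - alpha) * estimate_ms);
    }
    double budget_ms() const {
        constexpr double pad_ms = 1.0;  // fixed safety margin
        return estimate_ms + pad_ms;
    }
};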
DieH@rd
Member
(10-30-2015, 03:12 PM)

Originally Posted by tuxfool

You have to systematically predict the rendering time of each frame. I imagine this is quite hard; it would break when you get sudden rendering-time shifts.

Exactly. This is usable for games that have "reliable" render times no matter what is on the screen.
Durante
Come on down to Durante's drive-through PC port fixes. 15 minutes or less. Yelp: ★★★★★

Qualifications: Fixed Souls, Deadly Premonition, Lightning Returns, Umihara Kawase, Symphonia, PhD, likes mimosas.
(10-30-2015, 03:18 PM)

Originally Posted by Nzyme32

Is there a reason why it isn't more commonplace then?

I can think of three reasons:
  • As others have said, it requires consistent frametimes or accurate prediction.
  • Additionally, it requires a somewhat involved, well-tested, relatively low-level implementation. Getting it wrong is worse than not doing anything.
  • Finally, for the vast majority of games released, you can be happy if they are decently technically competent; you can't really expect significant development-resource investment in reducing latency by a few ms.

Originally Posted by OnionPowder

That's exactly what it reminded me of. Good to see it being put in place by a developer, only because GeDoSaTo is limited to DX9 (if I remember correctly)

In principle, the game itself can also do a much better job than any external tool.
Dictator93
Member
(10-30-2015, 03:21 PM)

Originally Posted by DieH@rd

Exactly. This is usable for games that have "reliable" render times no matter what is on the screen.

Basically, most Valve games :D

It is great to see them finally getting into the nitty-gritty of rendering after years of DX9 dormancy and Shader Model 2.0 in the Valve offices.

I am curious as to what other games this could be applied to; not many fit the profile of constantly running above the monitor refresh rate with a predictable load on a wide range of hardware.
oRuin
Junior Member
(10-30-2015, 03:22 PM)
Nice. I'll have to try this out.
ScepticMatt
Member
(10-30-2015, 03:23 PM)

Originally Posted by Gbraga

How much lower? Really not a fan of using vsync for any competitive game.

Up to 1 frame, usually less. Roughly the same as vsync off, but without tearing.

Originally Posted by Nzyme32

Is there a reason why it isn't more commonplace then?

  • It needs developers who have a good understanding of their engine's performance characteristics in order to make correct predictions.
  • It mostly benefits PC games that run at consistently high frame rates and need low input lag (e.g. LoL, Dota, CS:GO).
  • If console game developers had frame rate to spare, they would rather spend it on more graphical effects or image quality than on lag reduction.
  • Developers of more demanding games may not care enough about the few PC gamers who have a powerful enough single-GPU system (AFR SLI adds lag anyway).
Felix Lighter
Member
(10-30-2015, 03:26 PM)
It's very interesting, but in the long run I hope it's completely unnecessary because dynamic refresh monitors will be the norm.
Falk
dat puzzling face
(10-30-2015, 03:33 PM)

Originally Posted by Nzyme32

Is there a reason why it isn't more commonplace then?

I don't mean to steer this discussion into a PC vs console thing, but low input latency is also far more noticeable with a mouse than with thumbsticks, for the basic reason that mouse position maps directly to angle (or to coordinates, for pointers), while thumbstick position maps to the rate of change (dx/dy) of angle (or of coordinates).

A game has to be rather invested in an M&KB input scheme for this to really, really shine. (Don't get me wrong, lower input latency is everything for fighting games on gamepad, etc., too.)
Kumubou
Member
(10-30-2015, 03:43 PM)

Originally Posted by Dictator93

I am curious as to what other games this could be applied to, not many fit the profile of being above monitor refresh rate constantly and having predictable load on a wide range of hardware.

I could see it being useful with fighting games, which generally have a very static render target (since there's usually no variance in the number of character models and relatively low variance in the environment and graphical FX), are designed to run at or above the refresh rate (almost always 60 Hz) 100% of the time, and are also one of the few genres where getting those extra few ms back can matter, online or offline.

Whether any developer working in that genre would ever do this is another question entirely.
Lostconfused
I can make you pick a fight
With someone twice your size
(10-30-2015, 03:51 PM)

Originally Posted by Felix Lighter

It's very interesting, but in the long run I hope it's completely unnecessary because dynamic refresh monitors will be the norm.

But what about dynamic refresh televisions? Consoles will have a say in how developers spend their time and resources.

Originally Posted by Kumubou

Whether any developer working in that genre would ever do this is another question entirely.

They don't have to do it. Epic just needs a good implementation in their engine.
Dictator93
Member
(10-30-2015, 03:55 PM)

Originally Posted by Kumubou

Whether any developer working in that genre would ever do this is another question entirely.

Given how good Iron Galaxy has been to KI on a technical level, they could already be doing it for all we know.

But yeah, good point about fighting games.
Firebrand
Member
(10-30-2015, 04:06 PM)
Has the "regular" VSync latency improved in Dota 2 Reborn? Normally the added latency from Vsync isn't that bad for me, but in the Source engine for some reason the input lag becomes absolutely massive.
M3d10n
Member
(10-30-2015, 04:49 PM)
This technique keeps the previous frame on screen for a few extra milliseconds before input is collected for the next one. The effective input lag can actually end up being lower than with vsync off, if you think about it.

Originally Posted by ScepticMatt

Up to 1 frame, usually less. Roughly the same as vsync off, but without tearing.

  • It needs developers who have a good understanding of their engine's performance characteristics in order to make correct predictions.
  • It mostly benefits PC games that run at consistently high frame rates and need low input lag (e.g. LoL, Dota, CS:GO).
  • If console game developers had frame rate to spare, they would rather spend it on more graphical effects or image quality than on lag reduction.
  • Developers of more demanding games may not care enough about the few PC gamers who have a powerful enough single-GPU system (AFR SLI adds lag anyway).

It is viable on consoles, but only for games that aren't graphically demanding by design (something like WarioWare, for example).

It should also be useful for retro emulators.
Last edited by M3d10n; 10-30-2015 at 04:51 PM.
viveks86
Member
(10-30-2015, 04:52 PM)
Interesting. Hadn't even heard about this technique until now
ashecitism
Member
(10-30-2015, 04:52 PM)

Originally Posted by Firebrand

Has the "regular" VSync latency improved in Dota 2 Reborn? Normally the added latency from Vsync isn't that bad for me, but in the Source engine for some reason the input lag becomes absolutely massive.

People were experiencing lag even without V-Sync when it launched lol.
Felix Lighter
Member
(10-30-2015, 04:53 PM)

Originally Posted by Lostconfused

But what about dynamic refresh televisions? Consoles will have a say in how developers spend their time and resources.

But this solution is even less practical on a console, because the games have less margin for error when it comes to predicting frametimes. I hope dynamic refresh televisions exist someday as well.
HTupolev
Member
(10-30-2015, 05:42 PM)
Now they need to combine G-Sync and just-in-time vsync to handle all cases optimally.
stuminus3
Never buying another games console. Ever.
(10-30-2015, 05:45 PM)
Hooray!

ScepticMatt
Member
(10-31-2015, 01:51 PM)
Doesn't seem to be working correctly on my end yet.

vsync disabled:

double buffered vsync (fullscreen exclusive):

Windows Aero triple-buffered vsync + RivaTuner Statistics Server frame limiter (borderless windowed mode, in-game vsync disabled):

experimental vsync:
Durante
Come on down to Durante's drive-through PC port fixes. 15 minutes or less. Yelp: ★★★★★

Qualifications: Fixed Souls, Deadly Premonition, Lightning Returns, Umihara Kawase, Symphonia, PhD, likes mimosas.
(10-31-2015, 01:56 PM)

Originally Posted by Dictator93

Basically, most Valve games :D

It is great to see them finally getting into the nitty-gritty of rendering after years of DX9 dormancy and Shader Model 2.0 in the Valve offices.

Yea, I wonder how much of this is bleeding through from working on VR.
Lostconfused
I can make you pick a fight
With someone twice your size
(10-31-2015, 02:02 PM)

Originally Posted by Durante

Yea, I wonder how much of this is bleeding through from working on VR.

Probably a lot.
jmga
Member
(10-31-2015, 02:10 PM)
This looks like a perfect technique for VR if well implemented.
Dictator93
Member
(10-31-2015, 02:13 PM)

Originally Posted by Durante

Yea, I wonder how much of this is bleeding through from working on VR.

I can only imagine it is... either that or they hired someone new!

BTW, I saw this when poking around the .ini files for Bioshock Infinite and it reminded me of this thread:

Code:

// NOTE: [OpportunisticAsyncLoading] settings are ignored on console
[OpportunisticAsyncLoading]
; Enables the game to spend any time left at the end of a frame processing GC requests and adding newly streamed objects to the world. Will cause framerate drops if actual framerate is higher than bOpportunisticAsyncLoadingAssumedFPSWhenVSynced! Best to leave set to FALSE unless you are running with vsync on and have bOpportunisticAsyncLoadingAssumedFPSWhenVSynced set to your refresh rate.
bOpportunisticAsyncLoadingEnabled=FALSE
bOpportunisticAsyncLoadingAssumedFPSWhenVSynced=120
; Milliseconds per frame the game is allowed to spend on the main thread serializing streaming data during gameplay
John Caboose
Member
(10-31-2015, 02:16 PM)
That's really cool tech