
Triple buffering: "Why we love it"

For whatever reason, most gamers seem oblivious to the inherent benefits of triple buffering in games. It gives you the performance of standard double buffering but without the ugly (in my eyes game-breaking) tearing that comes with it. Anandtech have just put up an article that explains the process and why you should all be fans of it and have it enabled in nearly every PC game. To quote them:

Anandtech said:
So there you have it. Triple buffering gives you all the benefits of double buffering with no vsync enabled in addition to all the benefits of enabling vsync. We get smooth full frames with no tearing. These frames are swapped to the front buffer only on refresh, but they have just as little input lag as double buffering with no vsync at the start of output to the monitor. Even though "performance" doesn't always get reported right with triple buffering, the graphics hardware is working just as hard as it does with double buffering and no vsync and the end user gets all the benefit without the potential downside. Triple buffering does take up a handful of extra memory on the graphics hardware, but on modern hardware this is not a significant issue.


For anyone who doesn't like tearing, it essentially gives a significant framerate boost in every game you enable it in; for all practical purposes, it's a free GPU upgrade.
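If you're curious what that actually looks like under the hood, here's a toy C++ sketch of the flip logic the article describes (illustrative only, not real graphics API code, and simplified to one finished frame per refresh): the GPU always has a free buffer to draw into, and at each vertical refresh the newest completed frame is flipped to the front.

#include <array>
#include <cstdio>

int main() {
    // Three buffers: one on screen, one being drawn into, one just completed.
    std::array<const char*, 3> name = {"A", "B", "C"};
    int front = 0;    // buffer currently scanned out to the monitor
    int drawing = 1;  // buffer the GPU is rendering into

    for (int refresh = 0; refresh < 6; ++refresh) {
        int completed = drawing;          // the GPU finishes a frame; it never stalls
        drawing = 3 - front - completed;  // immediately start drawing into the spare buffer
        front = completed;                // at vblank, flip the newest frame to the front
        std::printf("refresh %d: displaying buffer %s\n", refresh, name[front]);
    }
}

The key point is the spare buffer: because three buffers are in rotation, the GPU never sits idle waiting for a vblank the way it does with double buffer vsync.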


If you're at all interested in the way that games render, want to know why triple buffering rocks, or are sceptical of the benefits, then I highly recommend you read the article:

http://www.anandtech.com/video/showdoc.aspx?i=3591&p=1


--------------------------------------------------------------------------------------------------------

Forcing Triple buffering in D3D games.

Now, the article has one glaring omission, and that's advice on how to enable triple buffering in games that don't use OpenGL (OpenGL titles being an increasingly tiny minority these days) or that don't explicitly include an option to enable it ingame. See, although the Nvidia and ATI control panels include an option to enable triple buffering in games, this setting only applies to OpenGL games, even though this may not be made particularly clear.

This is where D3DOverrider comes in, as it allows you to force vsync and triple buffering in any game that uses Direct3D, which is the vast majority of games on the market today. It uses close to zero resources, is easy to use with both global and per-application profiles available, and I consider it one of the most essential programs for any self-respecting PC gamer.
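For the technically curious, what D3DOverrider forces is roughly what a Direct3D 9 game could request for itself at device creation. A minimal sketch, assuming the standard d3d9.h header (the helper name is mine for illustration; D3DOverrider presumably gets the same result by intercepting the game's device creation calls):

#include <d3d9.h>

// Present parameters for fullscreen triple buffering + vsync (sketch only;
// window handle, refresh rate and error handling omitted).
D3DPRESENT_PARAMETERS makeTripleBufferedVsyncParams() {
    D3DPRESENT_PARAMETERS pp = {};
    pp.Windowed             = FALSE;
    pp.SwapEffect           = D3DSWAPEFFECT_DISCARD;
    pp.BackBufferWidth      = 1920;
    pp.BackBufferHeight     = 1200;
    pp.BackBufferFormat     = D3DFMT_X8R8G8B8;
    pp.BackBufferCount      = 2;                       // two back buffers + front buffer = triple buffering
    pp.PresentationInterval = D3DPRESENT_INTERVAL_ONE; // present on vblank (vsync)
    return pp;  // pass to IDirect3D9::CreateDevice()
}

Most games never set BackBufferCount to 2 themselves, which is exactly the gap D3DOverrider fills.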

The program is installed in a package along with Rivatuner, which is another essential program for PC hardware enthusiasts, as it allows you to do all manner of tweaking, overclocking and monitoring. Download it here:

http://www.guru3d.com/index.php?page=rivatuner

Now, since D3DOverrider is a separate program, you will need to launch it separately. On first launch, simply set:


"Start with Windows" to on
"Detection level" to medium
"Force Triple Buffering" to on
"Force Vsync" to on


I recommend that you do not also force Vsync through your drivers; just leave that setting at the default "application controlled."

Now, when you launch a D3D game you should hear a standard Windows success "beep" to indicate that triple buffering has been forced. If for whatever reason this causes problems with a particular game, simply click the "Add profile" cross in D3DOverrider, navigate to the relevant game executable, and create a new profile for it where nothing is forced on.

You can now enjoy all your games with a perfect, tear-free image but without the undesirable framerate and input latency hit that comes with vsync. Enjoy! :D

--------------------------------------------------------------------------------------------------------

For reference, here are some example images from the article that help illustrate what is happening with the three different rendering techniques.

Standard double buffering:

[diagram from the article]



Double buffer Vsync:

[diagram from the article]




Triple buffering:

[diagram from the article]


Note: If you are a dual GPU SLI/Crossfire user, you will not be able to use triple buffering. Sorry, folks.
 

Sinatar

Official GAF Bottom Feeder
Triple Buffering has a high GPU memory cost associated with it, so sometimes it can actually kill your performance if the game you're playing is already working your card fully.
 

Nif

Member
Would D3DOverrider do more than ATI Catalyst Control Center? (Probably a stupid question) I still notice vsync problems from time to time even though I forced it on there.
 
Sinatar said:
Triple Buffering has a high GPU memory cost associated with it, so sometimes it can actually kill your performance if the game you're playing is already working your card fully.

With 1GB becoming standard on modern GPUs, it's a very, very small piece of the pie in the grand scheme of things. Yes, if the extra buffer pushes your card past its dedicated memory pool, it may not be worth it, but that will be a very rare case on modern cards. 512MB cards should be fine to enable triple buffering as well, outside of Crysis and extreme resolutions perhaps.
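To put rough numbers on it, the cost is one extra back buffer: width × height × bytes per pixel. A quick back-of-envelope calculation in C++ (assuming a plain 32-bit colour target; HDR formats or MSAA surfaces would cost more):

#include <cstdio>

int main() {
    const double bytesPerPixel = 4.0;  // X8R8G8B8, 32-bit colour
    const int res[][2] = {{1280, 720}, {1680, 1050}, {1920, 1200}, {2560, 1600}};
    for (const auto& r : res) {
        double mb = r[0] * r[1] * bytesPerPixel / (1024.0 * 1024.0);
        std::printf("%4dx%4d: ~%4.1f MB for the extra buffer\n", r[0], r[1], mb);
    }
}

That works out to roughly 3.5MB at 720p and about 16MB even at 2560x1600, which is why it's a rounding error on a 512MB+ card.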

This is the reason why it's not widespread on the console side; any spare chunk of memory is precious when you're in a RAM-starved environment like that. However, the hit is in the single digits (in MB) at 720p, and personally I feel that is more than worth the tradeoff. RE5 is one example of a PS3 game confirmed to be using triple buffering, and it has some very high resolution textures. It's also believed this is what Naughty Dog is doing with Uncharted 2, and that game has the best textures on consoles, so if this is indeed the case it proves that enabling triple buffering on consoles doesn't have to mean compromising your texture budget to any noticeable degree.

I hate, hate, hate tearing, so I would hope more console developers (at least on the PS3 side, where implementation is straightforward) give it a go.
 

Zzoram

Member
I have always enabled Triple Buffering, ever since I had a GeForce 6800 256mb (1440x900) and I still do with my HD4870 512mb (1680x1050).
 
beermonkey@tehbias said:
No way. Maybe in a year or two. Lots of 1GB cards out there but it is far from standard.

Well, 512MB cards certainly are, and that's a very big memory buffer, more than enough for triple buffering in most cases; the memory hit really isn't that big. You can pick up super fast 1GB (or 896MB) cards for $150 today, so it will soon be very commonplace.


Nif said:
Would D3DOverrider do more than ATI Catalyst Control Center? (Probably a stupid question) I still notice vsync problems from time to time even though I forced it on there.

ATI CCC's triple buffering setting doesn't apply to D3D games at all, and I've always found D3DOverrider much more "forceful", so to speak, than any driver settings. So yes, it will often help you out where CCC can't.


Nikorasu said:
Problem for us SLI/crossfire users is that only single and triple-gpu setups support triple buffering.

Another one of the reasons why I'm just not a fan of multi-GPU rigs; it's such a broken solution in my eyes, but I won't go into that today. Google AFR and micro stutter if you want to do some reading on the subject.
 
Tearing is a problem I have, not in video games on my PC, but when watching videos, especially on Netflix Watch it now. Would enabling triple buffering on my video card fix this?

It is a 1GB Nvidia card.
 

bee

Member
makes fps games practically unplayable for me, but then again so does normal vsync; other game types are generally fine though
 
BlackNMild2k1 said:
Tearing is a problem I have, not in video games on my PC, but when watching videos, especially on Netflix Watch it now. Would enabling triple buffering on my video card fix this?

It is a 1GB Nvidia card.

It only applies to Direct3D applications, and since Netflix likely isn't using a Direct3D-based player to display video, it probably won't help there. However, it's worth a try.

The only video player I know of with excellent Direct3D support to remove tearing is MPC-HC:

http://mpc-hc.sourceforge.net/

Not only that, but it will use your GPU to decode a host of codecs, leaving your CPU mostly idle. It's the best media player on the market as far as I'm concerned. Instructions on enabling GPU decoding and the Direct3D mode to remove tearing are featured on the site and are pretty easy to follow.


bee said:
makes fps games practically unplayable for me, but then again so does normal vsync; other game types are generally fine though

It's all down to taste really; the problem is most don't know the option is there, nor what it does, and I can guarantee most would see some benefit from enabling it in many games. I'm not uber-sensitive to input lag, but I can spot tearing a mile off, so I have it forced globally in nigh on every game I play.
 

loganclaws

Plane Escape Torment
"Start with Windows" to on
"Detection level" to medium
"Force Triple Buffering" to on
"Force Vsync" to on

I recommend that you do not also force Vsync through your drivers; just leave that setting at the default "application controlled."

Now, when you launch a D3D game you should hear a standard Windows success "beep" to indicate that triple buffering has been forced. If for whatever reason this causes problems with a particular game, simply click the "Add profile" cross in D3DOverrider, navigate to the relevant game executable, and create a new profile for it where nothing is forced on.

You can now enjoy all your games with a perfect, tear-free image but without the undesirable framerate and input latency hit that comes with vsync. Enjoy!

Up above you say to force VSync on in the program's settings, then you say that this should let the player enjoy a tear-free image without the undesirable framerate and input latency hit that comes with VSync... I'm confused. Can you elaborate?
 

Grayman

Member
I should investigate this further. Can anything let you view the backbuffer "framerate" while triple buffering is active?
 
loganclaws said:
Up above you say to force VSync on in the program's settings, then you say that this should let the player enjoy a tear-free image without the undesirable framerate and input latency hit that comes with VSync... I'm confused. Can you elaborate?

It's because you're enabling "triple buffer vsync"; the two settings aren't separate, they work together, so to speak, to give the desired effect described.
 
Heh, brain_stew is on a one man mission to force MS and Sony to start next gen early by pointing out how much better PC graphics are and how reasonably priced PC gaming is.

You sly dog you. :lol
 

godhandiscen

There are millions of whiny 5-year olds on Earth, and I AM THEIR KING.
I was aware that triple buffering was superior; however, can you enable it in all games?

LiquidMetal14 said:
Didn't we just have a thread on this?
We had a thread of people complaining about V-Sync on consoles.

Nikorasu said:
Problem for us SLI/crossfire users is that only single and triple-gpu setups support triple buffering.
I forgot. No wonder I cannot enable it.
 
Enabling Triple Buffering in ZSnes caused so much input lag I couldn't tolerate it.

Sorry but anything that buffers frames before displaying them gets a massive thumbs down from me.

Most people here couldn't even notice the input lag in most next-gen games (PDZ, Killzone 2, etc.) or on their shitty LCD TVs, so I'm not expecting almost anyone to sympathize with me, but it seriously annoys me to no fucking end. I can't even stand playing SNES games on the Virtual Console on my friend's LCD TV, even with Game mode on. The input lag is just so bad compared to a good old CRT.

So yeah, fuck off with your triple buffering. The faster drawn frames get displayed, the better.

Prime example: Super Mario Galaxy. No buffering, no tearing, 60fps, no input lag.

Developers should strive to reach a perfect 60fps instead of implementing shitty buffering techniques to make up for their fucking incompetence.
 

Fox the Sly

Member
AtomicShroom said:
Enabling Triple Buffering in ZSnes caused so much input lag I couldn't tolerate it.

Sorry but anything that buffers frames before displaying them gets a massive thumbs down from me.

Most people here couldn't even notice the input lag in most next-gen games (PDZ, Killzone 2, etc.) or on their shitty LCD TVs, so I'm not expecting almost anyone to sympathize with me, but it seriously annoys me to no fucking end. I can't even stand playing SNES games on the Virtual Console on my friend's LCD TV, even with Game mode on. The input lag is just so bad compared to a good old CRT.

So yeah, fuck off with your triple buffering. The faster drawn frames get displayed, the better.

Prime example: Super Mario Galaxy. No buffering, no tearing, 60fps, no input lag.

Developers should strive to reach a perfect 60fps instead of implementing shitty buffering techniques to make up for their fucking incompetence.


Damn son, ain't that a little harsh? He isn't forcing it down anyone's throat.
 
Fox the Sly said:
Damn son, ain't that a little harsh? He isn't forcing it down anyone's throat.

His topic title implies that we all love it. I don't. And I certainly wouldn't want devs to start getting ideas from this thread and forcing this shit in their games, especially on consoles where it probably couldn't be disabled. Games have enough input lag as it is!
 

diddlyD

Banned
the solution for tearing is not triple buffering. it's game developers properly budgeting their engine load early in development and sticking to it. easier said than done; most devs have a "we'll fix performance later" attitude.
 
AtomicShroom said:
His topic title implies that we all love it. I don't. And I certainly wouldn't want devs to start getting ideas from this thread and forcing this shit in their games, especially on consoles where it probably couldn't be disabled. Games have enough input lag as it is!

Even you said, though, that most people don't notice it. Why should devs limit themselves if it's not an issue for 95%+ of players? For those it bothers, there's the PC and its customizable settings.
 
brain_stew said:
It's because you're enabling "triple buffer vsync"; the two settings aren't separate, they work together, so to speak, to give the desired effect described.

But wasn't the point of that article to demonstrate that Vsync isn't needed with triple buffering? It sounded like they were only talking about triple buffering when describing how good it was, not about it being used in conjunction with Vsync.
 

epmode

Member
I just installed this. Huge difference in some games OMG. And I didn't notice a huge input hit either. Tested with HL2 and Bioshock.

That said, I did notice my GPU fan working a lot harder when the framerate improvement was most significant. I suppose that's normal? I have a 9800 GTX+ (1GB) and I'm running at some ridiculously high resolutions.
 

Rur0ni

Member
Gonna try this when I get the chance. I've tried V-Sync (without triple buffering, since the driver setting doesn't apply to D3D, as explained), and it adds a good bit of input lag, which for someone like me is unacceptable. Despite that, it makes things milky smooth. The input lag is somewhat negated by reducing the frames drawn ahead in the driver; the nVidia default is 3, so I set it to 1. The mouse feels snappier but still slightly behind (syrupy feel).

Now, if D3DOverrider with vsync + triple buffering eliminates the input lag, or reduces it further, I may switch to that. :)

120hz LCDs are on the way too...
 

Sharp

Member
Sinatar said:
Triple Buffering has a high GPU memory cost associated with it, so sometimes it can actually kill your performance if the game you're playing is already working your card fully.
True, but in virtually every other case it is a benefit.
 

TheExodu5

Banned
Their "lack of input lag" line is BS. There is definite input lag with triple buffering. Imperceptible to most users maybe, but I can easily feel it.

I'll enable it in single player games when I want to avoid tearing and get the best quality possible. I'll disable it when I want the quickest response in my multiplayer shooters.

edit: you'd think it's possible that I've been running double buffering + vsync all this time... but my framerate clearly fluctuates between 30 and 60fps, so triple buffering must be enabled. I can't turn it off either... setting it to off does nothing. brain_stew has reported that the NVidia setting only affects OpenGL games.
 

TheExodu5

Banned
epmode said:
The only times I really notice 1-3 frames of input lag are in music games and old-school 2D platformers.

I'll notice it in music games; in games like Megaman it's very noticeable (all versions of MM9 have annoying input lag), and especially in shooters if I'm controlling with a mouse.

edit: know what's weird? World of Warcraft feels completely identical whether triple buffering is enabled or disabled in game. My framerate can still go from 30 to 60. Anyway, the weird thing is, it has a graphical option called "reduce input lag", and it completely eliminates input lag, even with v-sync. Why don't other games have this option!? What the hell does it do?
 

maus

Member
TheExodu5 said:
Their "lack of input lag" line is BS. There is definite input lag with triple buffering. Imperceptible to most users maybe, but I can easily feel it.

I'll enable it in single player games when I want to avoid tearing and get the best quality possible. I'll disable it when I want the quickest response in my multiplayer shooters.
yup

only game that has really benefited from forced triple buffering for me was Dreamfall, which tore like crazy and chugged with vsync on.

there is definite mouse lag with vsync/triple buffering in certain games. what all games need is an fps cap, because tearing at 60 fps and below isn't very noticeable.

the one game i've been struggling with lately is the Duke Nukem 3D High Resolution mod. horrible mouse lag with vsync/triple buffering, but horrible tearing without it. oh well
 

Nikorasu

Member
It's weird how vsync really varies from game to game for me. I always enable it in SP, and sometimes in MP too in certain titles. In some games, like Crysis, Quake Wars or FEAR 1/2, I can have vsync on and the input lag is more or less negligible, even when getting a solid 60fps, but in others, well, pretty much any Source engine game, the added lag makes things far more difficult on the MP side.

I can't use triple buffering due to my dual gpu setup, but my machine is fast enough that I'm getting a pretty solid 60fps on everything with regular vsync anyway. With the odd multiplayer exception, I'll take a smooth and solid image over ultra responsive controls any day.
 

Truespeed

Member
AtomicShroom said:
His topic title implies that we all love it. I don't. And I certainly wouldn't want devs to start getting ideas from this thread and forcing this shit in their games, especially on consoles where it probably couldn't be disabled. Games have enough input lag as it is!

Are you turning on triple buffering using the options in ZSnes or are you using D3DOverrider?
 
AtomicShroom said:
His topic title implies that we all love it. I don't. And I certainly wouldn't want devs to start getting ideas from this thread and forcing this shit in their games, especially on consoles where it probably couldn't be disabled. Games have enough input lag as it is!

It was the title of the article linked, and I really don't think you're grasping the situation at hand here. All modern 3D games use more than one framebuffer, and the point is that without triple buffering your alternatives are either a broken image through tearing, or much worse input lag and performance through standard double-buffer v-sync.

This isn't about 2D games, and it's less about console games. As far as I'm concerned, any game with a significant amount of tearing is borderline unplayable and broken, so you're left with two options: double-buffer v-sync or triple-buffer v-sync. The former suffers from a significantly larger amount of lag and performance degradation, whilst the latter does neither to any real degree, at the small cost of about 15-25MB of video RAM at PC resolutions.
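To illustrate the performance half of that claim, here's a quick C++ sketch at 60Hz (a deliberate simplification that ignores render-ahead queues): with double-buffer v-sync the swap has to wait for the next vblank, so the cost of a frame effectively rounds up to a whole number of refresh intervals, whereas with triple buffering the GPU just keeps rendering.

#include <cmath>
#include <cstdio>
#include <initializer_list>

int main() {
    const double refreshMs = 1000.0 / 60.0;  // ~16.7ms per refresh at 60Hz
    for (double renderMs : {10.0, 17.0, 20.0, 30.0}) {
        // Double buffering + vsync: wait for the next vblank after every frame.
        double synced = std::ceil(renderMs / refreshMs) * refreshMs;
        // Triple buffering: render continuously, display capped at the refresh rate.
        double triple = std::fmax(renderMs, refreshMs);
        std::printf("GPU frame %4.1fms -> double-buffer vsync %4.1ffps, triple buffer ~%4.1ffps\n",
                    renderMs, 1000.0 / synced, 1000.0 / triple);
    }
}

A frame that takes 17ms instead of 16ms drops you straight from 60fps to 30fps under double-buffer v-sync; with triple buffering you keep ~59fps. That's the "free GPU upgrade" I mentioned in the OP.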

If you're dealing with closed hardware and can make absolutely sure that your engine will run at 60fps all the time, then fine, no need to deal with this, but I honestly can't remember the last modern 3D game I played that had a 100% locked 60fps framerate. In those situations triple-buffer v-sync is, in my (and Anandtech's) opinion, the superior option. A tear fest is never a solution; what good is extra visual data if it's broken and glitchy?

For whatever reason most PC gamers aren't aware of the benefits of triple buffering, and even fewer know how to force it in all games, so yes, just like Anandtech, I feel it's something gamers should be educated on, because it can have very large real world benefits.

If input lag is a problem for you and you don't mind tearing, then go ahead and stick with standard double buffering; nothing's stopping you, but at least now you have the information to make an informed choice.

Oh, and fwiw, the maximum of a dozen or so milliseconds of input lag that can be introduced through vertical sync (in any of its forms) is not responsible for the 100ms+ inherent in many modern console games; there are other things at play there, and triple buffering is not what's to blame.


diddlyD said:
the solution for tearing is not triple buffering. it's game developers properly budgeting their engine load early in development and sticking to it. easier said than done; most devs have a "we'll fix performance later" attitude.

You can't do this on an open platform like the PC, and creating a modern 3D game where the engine load is balanced across multiple dynamic scenarios is much more difficult than you're making out. Gamers will always find a way to "break" your game and overload your engine.
 
Testing it out with a game (Supreme Commander demo), and it did that little Windows jingle thing prior to starting up, indicating that D3DOverrider was working, but I'm getting no such indication when starting up Crysis from Steam. Is it because it's OpenGL?
 
TheExodu5 said:
Their "lack of input lag" line is BS. There is definite input lag with triple buffering. Imperceptible to most users maybe, but I can easily feel it.

I'll enable it in single player games when I want to avoid tearing and get the best quality possible. I'll disable it when I want the quickest response in my multiplayer shooters.

edit: you'd think it's possible that I've been running double buffering + vsync all this time... but my framerate clearly fluctuates between 30 and 60fps, so triple buffering must be enabled. I can't turn it off either... setting it to off does nothing. brain_stew has reported that the NVidia setting only affects OpenGL games.

You're reading the FRAPS counter wrong there, I believe, as it's a measure over a set period of time, not an indication of the frametime of the last rendered frame. So yes, it could hover between 30fps and 60fps even with standard double-buffer v-sync. You can get FRAPS to report frametimes, and if you're not using an ingame triple buffering setting or an OpenGL game, then your frametimes will only ever be ~16ms or ~32ms, iirc.
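To see why the counter behaves like that, here's a toy C++ version of what an averaging fps counter does (assuming a worst case where frames alternate between the only two frametimes double-buffer v-sync allows at 60Hz):

#include <cstdio>
#include <vector>

int main() {
    // Under double-buffer vsync every frame takes exactly 1 or 2 refresh
    // intervals, but a counter that averages over a window reads in between.
    std::vector<double> frameMs;
    for (int i = 0; i < 30; ++i) {
        frameMs.push_back(1000.0 / 60.0);  // a 16.7ms frame
        frameMs.push_back(1000.0 / 30.0);  // a 33.3ms frame
    }
    double totalMs = 0;
    for (double ms : frameMs) totalMs += ms;
    std::printf("reported average: %.0f fps\n", frameMs.size() / (totalMs / 1000.0));
    // prints "40 fps" -- between 30 and 60, despite only two distinct frametimes
}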

You may be associating the lag with triple buffering when in fact it's the lag from standard double-buffer v-sync you're experiencing. Triple buffering will reduce lag compared to the normal v-sync setting (to a level very similar to normal non-v-sync double buffering).

maus said:
wait, what

pretty sure triple buffering is supposed to be enabled congruently with vsync in order to reap its benefits.


Indeed it does, but I'm telling him not to enable it twice. If you're forcing it globally through D3DOverrider, you do not want to force it ingame as well, to avoid any potential incompatibility problems. Force both v-sync and triple buffering from the same place (i.e. D3DOverrider) is what I'm saying to do.

It'd probably work just fine, but it's simpler to use one setting to control it, to avoid any potential conflicts.
 
brain_stew said:
Indeed it does, but I'm telling him not to enable it twice. If you're forcing it globally through D3DOverrider, you do not want to force it ingame as well, to avoid any potential incompatibility problems. Force both v-sync and triple buffering from the same place (i.e. D3DOverrider) is what I'm saying to do.

It'd probably work just fine, but it's simpler to use one setting to control it, to avoid any potential conflicts.

So, disregarding Crysis for now (since I couldn't get the little Windows jingle to go off when running it), I tried TF2 and heard the jingle there... so I went into advanced video options and disabled vsync, but I'm seeing significant tearing, as if D3DOverrider's vsync isn't applying.
 