
Triple buffering: "Why we love it"

TheExodu5 said:
I'll notice it in music games, in games like Megaman it's very noticeable (all versions of MM9 have annoying input lag), and especially in shooters if I'm controlling with a mouse.

edit: know what's weird? World of Warcraft feels completely identical, whether triple buffering is enabled or disabled in game. My framerate can still go from 30-60. Anyways, the weird thing is, it has a graphical option called "reduce input lag", and it completely eliminates input lag, even with v-sync. Why don't other games have this option!? What the hell does it do?

Whilst v-sync and the like can contribute to input lag, the biggest problem almost always comes from poorly coded games. When a modern LCD is contributing 50ms of lag and a game's engine another 100ms, you're really not going to notice an extra 16ms on top of that, and you shouldn't get close to that 16ms figure with triple buffering anyway.
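To put those numbers in perspective, here's a trivial back-of-the-envelope calculation (the figures are the illustrative ones from the post above, not measurements):

```cpp
#include <cstdio>

int main() {
    // Illustrative figures from the post above, not measurements.
    double displayMs = 50.0;          // slow LCD internal processing
    double engineMs  = 100.0;         // a badly coded engine's pipeline
    double vsyncMs   = 1000.0 / 60.0; // worst case: one extra 60 Hz refresh

    double total = displayMs + engineMs + vsyncMs;
    printf("total lag: %.1f ms, of which v-sync is %.0f%%\n",
           total, 100.0 * vsyncMs / total); // ~166.7 ms, v-sync ~10%
}
```

Even in the worst case, the synchronization penalty is about a tenth of that hypothetical chain.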
 

maus

Member
Truespeed said:
In the case of DirectX, the answer is Yes.
yes as in you're supposed to disable vsync and enable triple buffering?

as far as i know, triple buffering merely allows vsync to operate over a wider range of framerates, so fps drops aren't as noticeable. wikipedia agrees with me.

the OP seems to be talking jargon for the most part.
 
MisterAnderson said:
So disregarding Crysis for now (since I couldn't get the little windows jingle to go off when running it) I tried TF2 and I heard the jingle there... so I went into advanced video options and disabled vsync, but I'm seeing significant tearing as if D3DOverrider vsync isn't applying.

Crysis has its own in-engine triple buffering setting (found in its config file), and it won't work with D3DOverrider, so create a profile with it disabled for that game.

Have you got your driver "V-sync" setting set to "Application"?

Both triple buffering and V-sync set to "on."

Tried a higher "detection level"?
 

Ranger X

Member
AtomicShroom said:
Enabling Triple Buffering in ZSnes caused so much input lag I couldn't tolerate it.

Sorry but anything that buffers frames before displaying them gets a massive thumbs down from me.

Most people here couldn't even tell you there's input lag in most next-gen games (PDZ, Killzone 2, etc.) or even on their shitty LCD TVs, so I'm not expecting almost anyone to sympathize with me, but it seriously annoys me to no fucking end. I can't even stand playing SNES games on the Virtual Console on my friend's LCD TV, even with Game mode on. The input lag is just so bad compared to a good old CRT.

So yeah, fuck off with your triple buffering. The faster drawn frames get displayed, the better.

Prime example: Super Mario Galaxy. No buffering, no tearing, 60fps, no input lag.

Developers should strive to reach a perfect 60fps instead of implementing shitty buffering techniques to make up for their fucking incompetence.



Totally agreed. Again, a situation of making visuals better to the detriment of efficiency and precision in controls. All of this bull when devs CAN make games like the prime example that is Super Mario Galaxy.

I'll take tearing and whatnot any day before input lag. tks.

 
maus said:
yes as in you're supposed to disable vsync and enable triple buffering?

as far as i know, triple buffering merely allows vsync to operate over a wider range of framerates, so fps drops aren't as noticeable. wikipedia agrees with me.

the OP seems to be talking jargon for the most part.

No, what we're getting at is that, yes, you turn on v-sync, but there's little point enabling it in-game when you're forcing it through D3DOverrider already.

It shouldn't actually make much difference in the grand scheme of things though, so don't get caught up on it.
 

maus

Member
hey op you're talking out of your ass.

triple buffering is supposed to be enabled along with vsync.

it is merely a way for lower end hardware to keep up with the limitations of vsync. it is absolutely supposed to be enabled concurrently with vsync.

good day.
 
MisterAnderson said:
So disregarding Crysis for now (since I couldn't get the little windows jingle to go off when running it) I tried TF2 and I heard the jingle there... so I went into advanced video options and disabled vsync, but I'm seeing significant tearing as if D3DOverrider vsync isn't applying.

Try restarting TF2. I believe D3DOverrider forces triple buffering at run time, and you may have just overridden that setting by switching it in-game. Leave the setting as is, restart, and report back.
 
maus said:
hey op you're talking out of your ass.

triple buffering is supposed to be enabled along with vsync.

it is merely a way for lower end hardware to keep up with the limitations of vsync. it is absolutely supposed to be enabled concurrently with vsync.

good day.

Jesus christ, reading comprehension man, reading comprehension!! I know this, said it several times. All I'm saying is that since you're overriding the in-game setting, best not to confuse things by forcing v-sync through there as well. It likely won't make a difference anyway, and it's a throwaway point that you're hung up on.

All I'm questioning is the wisdom of turning on "v-sync" twice. Once should be enough.
 
maus said:
hey op you're talking out of your ass.

triple buffering is supposed to be enabled along with vsync.

it is merely a way for lower end hardware to keep up with the limitations of vsync. it is absolutely supposed to be enabled concurrently with vsync.

good day.
brain_stew said:

"Start with Windows" to on
"Detection level" to medium
"Force Triple Buffering" to on
"Force Vsync" to on

Also brain_stew, I managed to get Crysis to do the jingle but only when launching the 32-bit version.

Edit: didn't see your reply regarding TF2, I'll try restarting. And yeah, I didn't know Crysis had triple buffering in-game already; figured it didn't since there was no option to enable/disable. Also I actually can't find ANY vsync options in the CCC for my 4890... that's one of the first things I looked for after installing D3DOverrider and couldn't find.
 

maus

Member
brain_stew said:
No, what we're getting at is that, yes, you turn on v-sync, but there's little point enabling it in-game when you're forcing it through D3DOverrider already.

It shouldn't actually make much difference in the grand scheme of things though, so don't get caught up on it.
all vsync + triple buffering does is allow your computer to use a wider variety of framerates rather than being locked to half (or a third, and so on) of your refresh rate.

it does not lower latency in any way; in fact i'm reading it adds an additional frame of latency.
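The framerate "steps" maus is describing are easy to demonstrate. A minimal sketch, assuming a simple model where the GPU renders one frame at a time and double buffering stalls it until the next refresh:

```cpp
#include <algorithm>
#include <cmath>
#include <cstdio>

// Double-buffered v-sync: a finished frame must wait for the next refresh,
// so a frame costing more than one interval occupies two (or three, ...),
// quantizing fps at 60 Hz to 60, 30, 20, 15, ...
double doubleBufferedFps(double renderMs, double hz) {
    double interval = 1000.0 / hz;
    int intervals = std::max(1, (int)std::ceil(renderMs / interval));
    return hz / intervals;
}

// Triple-buffered v-sync: the GPU draws into the spare back buffer instead
// of stalling, so fps is simply the render rate capped at the refresh rate.
double tripleBufferedFps(double renderMs, double hz) {
    return std::min(1000.0 / renderMs, hz);
}

int main() {
    for (double ms : {10.0, 17.0, 25.0, 40.0})
        printf("%2.0f ms/frame: double-buffered %4.1f fps, triple-buffered %4.1f fps\n",
               ms, doubleBufferedFps(ms, 60.0), tripleBufferedFps(ms, 60.0));
}
```

A frame costing 17 ms, just over one 60 Hz interval, drops to 30 fps under double-buffered v-sync but stays near 59 fps with the third buffer; that gap is where the "up to 50%" boost quoted elsewhere in the thread comes from.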
 
MisterAnderson said:
Also brain_stew, I managed to get Crysis to do the jingle but only when launching the 32-bit version.

Edit: didn't see your reply regarding TF2, I'll try restarting. And yeah, I didn't know Crysis had triple buffering in-game already; figured it didn't since there was no option to enable/disable.

You've got to change a config file variable to enable it.
 
maus said:
all vsync + triple buffering does is allow your computer to use a wider variety of framerates rather than being locked to half (or a third, and so on) of your refresh rate.

it does not lower latency in any way; in fact i'm reading it adds an additional frame of latency.

Why don't you read the article this whole thread is based upon, huh? No, it doesn't reduce input lag compared to normal double buffering, never claimed that, but yes, it does reduce input lag compared to standard double-buffered v-sync. Look at the horse pictures if you want a visual representation, or how about going ahead and reading the OP and the posts you're replying to?

Not directly related, but for those that are sensitive to input latency: change your driver setting "maximum pre-rendered frames" to either 0 or 1.


Edit: MisterAnderson, I was right. I booted up TF2 and with the driver setting on application, the game preset to disabled, and both v-sync and triple buffering set to on, it worked smooth as butter. I then tried enabling then immediately disabling v-sync in game and sure enough it started tearing. So, yup, a quick restart of the game should sort it out; all you did was alter the setting forced at runtime. You effectively "overrided" "overrider"! :lol :lol
 
brain_stew said:
Crysis has its own in-engine triple buffering setting (found in its config file), and it won't work with D3DOverrider, so create a profile with it disabled for that game.

Just a clarification when dealing with profiles in D3DOverrider. If I set a global profile with it set to on w/ medium application detection, will that global profile override any profiles set with it off? As in, do I need to make a profile for each application I plan on running with D3DOverrider set to on, or can I just have the global setting set to on and then switch it off for the apps I don't want it running on (Crysis)?

Also, if you could tell me how to change the Crysis config file to enable triple buffering that would be great, as I'd love to test it out; I tried searching and can't find anything telling me how to do that.

Edit:
brain_stew said:
MisterAnderson, I was right. I booted up TF2 and with the driver setting on application, the game preset to disabled, and both v-sync and triple buffering set to on, it worked smooth as butter.

I can't for the life of me find the driver settings in the Catalyst Control Center for making Vsync application driven. Am I blind?
 

maus

Member
brain_stew said:
Why don't you read the article this whole thread is based upon, huh? No, it doesn't reduce input lag compared to normal double buffering, never claimed that, but yes, it does reduce input lag compared to standard double-buffered v-sync. Look at the horse pictures if you want a visual representation, or how about going ahead and reading the OP and the posts you're replying to?

well if you read the comments of the article you posted you'll see that the issue is much more complicated than what you've stated.
 
MisterAnderson said:
Just a clarification when dealing with profiles in D3DOverrider. If I set a global profile with it set to on w/ medium application detection, will that global profile override any profiles set with it off? As in, do I need to make a profile for each application I plan on running with D3DOverrider set to on, or can I just have the global setting set to on and then switch it off for the apps I don't want it running on (Crysis)?

Also, if you could tell me how to change the Crysis config file to enable triple buffering that would be great, as I'd love to test it out; I tried searching and can't find anything telling me how to do that.

Edit:

I can't for the life of me find the driver settings in the Catalyst Control Center for making Vsync application driven. Am I blind?

Oh, I'm on Nvidia here, don't know what it's called on the ATI side; it's probably the default setting.
 
maus said:
well if you read the comments of the article you posted you'll see that the issue is much more complicated than what you've stated.

Is it an oversimplification? Sure. Is the takeaway, that gamers who want to get rid of tearing are much better off in terms of performance and input lag using triple-buffered v-sync over double-buffered v-sync so long as they have the VRAM to spare (which they more than likely do), still true? Absolutely. I don't consider tearing a solution; a torn frame is a lost frame as far as I'm concerned: broken, incomprehensible and glitchy. So yes, triple-buffered v-sync really is the best solution in most cases as far as I'm concerned.
 
brain_stew said:
Oh, I'm on Nvidia here, don't know what it's called on the ATI side; it's probably the default setting.

Yeah, I'm pretty sure it's only application driven because I've checked 3 times and can't find it. And I'm 90% sure after restarting TF2 that it's working now, I'm just too tired to be 100% because it looks like it could be tearing but I'm pretty sure it isn't... I'm just delirious methinks.
 

Truespeed

Member
maus said:
yes as in you're supposed to disable vsync and enable triple buffering?

as far as i know, triple buffering merely allows vsync to operate over a wider range of framerates, so fps drops aren't as noticeable. wikipedia agrees with me.

the OP seems to be talking jargon for the most part.

Wikipedia said:
Another method of triple buffering involves synchronizing with the monitor frame rate, and simply using the third buffer as a method of providing breathing room for changing demands in the amount of graphics drawn. This is the use of a buffer in the true sense whereby the buffer acts as a reservoir. Such a method requires a higher minimum specification of the target hardware but provides a consistent (vs. variable) frame rate. This is the case when using triple buffering in DirectX, where a chain of 3 buffers are rendered and always displayed.

That's the way I understand it. According to Wikipedia, the triple buffering solution requires the use of vsync in order for it to work.
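For reference, here's roughly what that DirectX arrangement looks like in code. A minimal sketch using the legacy Direct3D 9 API (Windows-only, error handling omitted): requesting two back buffers gives the three-deep, always-displayed chain the Wikipedia passage describes.

```cpp
#include <d3d9.h>
#pragma comment(lib, "d3d9.lib")

// Create a device whose swap chain is three buffers deep (one front
// buffer plus two back buffers), presented in sync with the refresh.
IDirect3DDevice9* createTripleBufferedDevice(IDirect3D9* d3d, HWND hwnd)
{
    D3DPRESENT_PARAMETERS pp = {};
    pp.Windowed             = TRUE;
    pp.hDeviceWindow        = hwnd;
    pp.SwapEffect           = D3DSWAPEFFECT_DISCARD;
    pp.BackBufferFormat     = D3DFMT_UNKNOWN;          // match the desktop
    pp.BackBufferCount      = 2;                       // 2 back + 1 front = 3
    pp.PresentationInterval = D3DPRESENT_INTERVAL_ONE; // v-sync on

    IDirect3DDevice9* device = nullptr;
    d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hwnd,
                      D3DCREATE_HARDWARE_VERTEXPROCESSING, &pp, &device);
    return device; // nullptr on failure; check the HRESULT in real code
}
```

Because every frame in this chain queues behind the previous two, this flavour smooths the framerate but can add a frame of latency, unlike the page-flipping scheme the Anandtech article describes, where the newest completed frame jumps the queue.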
 
Truespeed said:
That's the way I understand it. According to Wikipedia, the triple buffering solution requires the use of vsync in order for it to work.

Honestly, how or why it works really isn't all that important. It's nice to learn about, sure, but that wasn't the major aim of this thread anyway. I'm not going to pretend to be an expert on the way graphics rendering works, but I have learned enough here to get some useful, practical advice to improve my in-game experience and I'm simply passing that on.

The fact is, PC gamers that don't like tearing can get a serious framerate boost in many games by enabling triple buffering over the standard v-sync they've been using in the past. I've let people know how to do that, and if one person or more benefits from it, then great, this thread has served its purpose.

It seems some took issue with the thread title, a title that was simply a copy of the title used in the article this thread is based upon.
 

Truespeed

Member
MisterAnderson said:
I can't for the life of me find the driver settings in the Catalyst Control Center for making Vsync application driven. Am I blind?

It's called "Wait for Vertical Refresh" in the All Settings section of the ATI control panel.

All Settings: If you find that changing settings with the Preview screen visible is annoying you, or if you simply want to see all the settings in one summary interface, then you might want to adjust your 3D settings using this option. Just click this option and in place of the Preview screen will appear all the setting sliders and tick boxes for you to adjust as discussed on the previous page. Importantly, there is a significant additional option under the All Settings section which is not available elsewhere in the Control Center:

Wait for Vertical Refresh: This option controls Vertical Synchronization (VSync) in games. VSync is the synchronization of your graphics card and monitor's abilities to redraw the screen a number of times each second (measured in FPS or Hz), and is explained in greater detail on this page of the Gamer's Graphics & Display Settings Guide. There are four choices here:

Always Off - Vertical Sync will always be set to Off, regardless of the setting in the game or 3D application. This provides fastest performance but may result in image 'tearing' which can be annoying to some.

Off, Unless Application Specifies - Vertical Sync will be off by default, however if you choose to enable it in a game or 3D application, it will be enabled for that game/app. This is the recommended mode.

On, Unless Application Specifies - Same as above, except Vertical Sync will be on by default unless otherwise disabled in a particular game or 3D application.

Always On - Vertical Sync will always remain on, regardless of the setting in the game or 3D application. This is not recommended, as it may reduce performance; however, it guarantees that there will never be any screen "tearing".
 
Truespeed said:
It's called "Wait for Vertical Refresh" in the All Settings section of the ATI control panel.

Basically I'm an idiot and didn't realize there was a vertical scroll bar under the "all settings" section. Lol.
 

Truespeed

Member
brain_stew said:
Honestly, how or why it works really isn't all that important. It's nice to learn about, sure, but that wasn't the major aim of this thread anyway. I'm not going to pretend to be an expert on the way graphics rendering works, but I have learned enough here to get some useful, practical advice to improve my in-game experience and I'm simply passing that on.

The fact is, PC gamers that don't like tearing can get a serious framerate boost in many games by enabling triple buffering over the standard v-sync they've been using in the past. I've let people know how to do that, and if one person or more benefits from it, then great, this thread has served its purpose.

We should also remember that the large majority of people playing these games won't notice any screen tearing until they're told what to look for. And even then they probably couldn't spot it so it's really subjective, I guess. It's only the vocal minority that seems to make an issue out of it. But, how exactly does triple buffering improve your framerate? Aren't you sacrificing framerate to avoid tearing? If you're displaying 3 buffers instead of 2 wouldn't it take longer and therefore decrease the number of frames that can be displayed within 1 second?
 

Truespeed

Member
MisterAnderson said:
Basically I'm an idiot and didn't realize there was a vertical scroll bar under the "all settings" section. Lol.

No, it's not you. It's ATI's lame .NET Control Center.
 
Truespeed said:
We should also remember that the large majority of people playing these games won't notice any screen tearing until they're told what to look for. And even then they probably couldn't spot it so it's really subjective, I guess. It's only the vocal minority that seems to make an issue out of it. But, how exactly does triple buffering improve your framerate? Aren't you sacrificing framerate to avoid tearing? If you're displaying 3 buffers instead of 2 wouldn't it take longer and therefore decrease the number of frames that can be displayed within 1 second?

I honestly can't comprehend how anyone can fail to see a game tearing; the screen literally shears in half. I think more people notice it than you'd expect. Even if they don't know what it is, they'll see it and be able to appreciate it if it were removed.

My point is that you're increasing your performance (often considerably) over normal v-sync; performance will be basically on par with rendering without any v-sync at all, whereas double-buffered v-sync can slash your framerate by as much as half.
 

SappYoda

Member
I think the reason 360 and PS3 games lack triple buffering may be this:

The third buffer also uses additional memory, which could be used for other data (like textures).

If this is true, games would need to sacrifice graphics in order to remove tearing.
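The cost is easy to estimate; a quick sketch, assuming a 32-bit (4 bytes per pixel) colour buffer:

```cpp
#include <cstdio>

int main() {
    // Memory for one extra back buffer at 4 bytes per pixel.
    struct Mode { const char* name; int w, h; } modes[] = {
        {"1280x720 (typical console)", 1280, 720},
        {"1920x1080 (PC)",             1920, 1080},
    };
    for (const Mode& m : modes) {
        double mb = m.w * (double)m.h * 4.0 / (1024.0 * 1024.0);
        printf("%-27s %.1f MB\n", m.name, mb); // ~3.5 MB and ~7.9 MB
    }
}
```

Roughly 3.5 MB at 720p is pocket change on a PC graphics card, but a real cost inside a console's small, fixed memory budget, where it competes directly with textures.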
 

loganclaws

Plane Escape Torment
One more question: will enabling triple buffering + vsync from the program limit fps to the monitor's refresh rate just like normal vsync (from the video card's control panel)? Are you only talking about reduced input lag as a benefit?
 
loganclaws said:
One more question: will enabling triple buffering + vsync from the program limit fps to the monitor's refresh rate just like normal vsync (from the video card's control panel)? Are you only talking about reduced input lag as a benefit?

Yes.
 

dLMN8R

Member
Wow, what a pathetic shitload of stupid in this thread.

-Triple buffering reduces mouse lag with vsync on compared to double buffering. The article explains exactly why.
-Results from triple buffering in fucking SNES emulators are not representative of what triple buffering actually does in modern 3D games.


Anandtech knows their shit. Especially Derek Wilson, who happens to be that article's author. Apparently people on this message board not only speak out of their ass, but can't read either :lol
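For anyone who'd rather see the mechanism than argue about it, here's a toy simulation of both schemes at 60 Hz. It's a sketch under simple assumptions (input sampled at render start, constant render time), not the article's methodology:

```cpp
#include <algorithm>
#include <cstdio>

// Toy model: input is sampled when a frame starts rendering; r = GPU time
// per frame (ms); a vblank arrives every T ms. With double buffering the
// GPU stalls once a finished frame is waiting for the flip; with triple
// buffering it keeps rendering and the newest finished frame is shown.
struct Result { double fps, lagMs; };

Result simulate(double r, double T, bool triple)
{
    double gpuFree = 0.0;  // time the GPU can start its next frame
    double ready   = -1.0; // input timestamp of newest finished frame
    double shown   = -1.0; // input timestamp of the frame on screen
    double lagSum  = 0.0;
    int flips = 0;
    const int vblanks = 60000;

    for (int k = 1; k <= vblanks; ++k) {
        double v = k * T;
        while (gpuFree + r <= v) {              // frames finishing by this vblank
            if (!triple && ready >= 0.0) break; // double buffering: GPU stalls
            double sample = gpuFree;            // input read at render start
            gpuFree += r;
            ready = sample;                     // newest finished frame wins
        }
        if (ready >= 0.0 && ready != shown) {   // flip to a new frame
            lagSum += v - ready;                // input-to-display latency
            shown = ready;
            ready = -1.0;
            ++flips;
            if (!triple) gpuFree = std::max(gpuFree, v); // stall ends at flip
        }
    }
    return { flips / (vblanks * T / 1000.0), lagSum / flips };
}

int main()
{
    const double T = 1000.0 / 60.0;
    for (double r : {10.0, 17.0, 25.0}) {
        Result d = simulate(r, T, false), t = simulate(r, T, true);
        printf("%2.0f ms/frame: double %4.1f fps / %4.1f ms lag, "
               "triple %4.1f fps / %4.1f ms lag\n",
               r, d.fps, d.lagMs, t.fps, t.lagMs);
    }
}
```

At a 17 ms render time it reports roughly 30 fps and 33 ms of lag for double-buffered v-sync against roughly 59 fps and 25 ms for triple buffering: more frames, and fresher ones, which is the article's whole point.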
 

Truespeed

Member
dLMN8R said:
Wow, what a pathetic shitload of stupid in this thread.

-Triple buffering reduces mouse lag with vsync on compared to double buffering. The article explains exactly why.
-Results from triple buffering in fucking SNES emulators are not representative of what triple buffering actually does in modern 3D games.


Anandtech knows their shit. Especially Derek Wilson, who happens to be that article's author. Apparently people on this message board not only speak out of their ass, but can't read either :lol

Excellent point. The pervasiveness of triple buffering in today's games is further justification that "Derek Wilson" is smarter than all game developers.
 
Truespeed said:
Excellent point. The pervasiveness of triple buffering in today's games is further justification that "Derek Wilson" is smarter than all game developers.

In this regard, yeah, he absolutely is. There is zero reason to leave a triple buffering setting out of any modern PC game; there's just no solid argument for it at all. It's an incredibly small amount of work that can see the framerate of many of your end users increase by up to 50%. If a tiny bit of "optimisation" with benefits as large as that can't be justified, then why optimise your game engine at all? Gamers shouldn't have to resort to workarounds. On the console side, sure, things are different given the anemic RAM to play with, but I'll be damned if it's not a better solution than the clusterfuck of shit that is constant tearing, which we see in far too many modern games.

Red Faction, Ghostbusters and Bionic Commando are just three high-profile recent examples of console games where we're seeing as much as 50% of frames suffering from tearing. Any game where 50% or more of the visual information presented to the player is not in a comprehensible state is not a fit product, it really isn't.
 
I was playing a bunch of games today with triple buffering. Crysis seemed nice, but everything else I played was tearing; not terrible tearing, but still tearing nonetheless. I guess triple buffering doesn't play well with all games.

I should have Triple Buffering on, and V-Sync off, yes?
 

Grayman

Member
TouchMyBox said:
I was playing a bunch of games today with triple buffering. Crysis seemed nice, but everything else I played was tearing; not terrible tearing, but still tearing nonetheless. I guess triple buffering doesn't play well with all games.

I should have Triple Buffering on, and V-Sync off, yes?
no
 
TouchMyBox said:
I was playing a bunch of games today with triple buffering. Crysis seemed nice, but everything else I played was tearing; not terrible tearing, but still tearing nonetheless. I guess triple buffering doesn't play well with all games.

I should have Triple Buffering on, and V-Sync off, yes?

No, have D3DOverrider force both V-Sync and triple buffering as described in the OP. You shouldn't be seeing any tearing.
 

bee

Member
dLMN8R said:
Wow, what a pathetic shitload of stupid in this thread.

-Triple buffering reduces mouse lag with vsync on compared to double buffering. The article explains exactly why.
-Results from triple buffering in fucking SNES emulators are not representative of what triple buffering actually does in modern 3D games.


Anandtech knows their shit. Especially Derek Wilson, who happens to be that article's author. Apparently people on this message board not only speak out of their ass, but can't read either :lol

i don't need an article to tell me what i've known for 10 years, triple buffer is lag city in fps games, easiest way to show it is go and play left4dead with vsync disabled then enabled then triple buffered and it's clear as day

well unless you play on some shitty lcd monitor or tv with huge amounts of input lag already like any recent samsung tv/100hz sony tv or a monitor like a dell 2408
 

Ikuu

Had his dog run over by Blizzard's CEO
I just messed around in L4D and to be honest I couldn't see much of a difference in the input regardless of the setting.
 

dLMN8R

Member
bee said:
i don't need an article to tell me what i've known for 10 years, triple buffer is lag city in fps games, easiest way to show it is go and play left4dead with vsync disabled then enabled then triple buffered and it's clear as day

well unless you play on some shitty lcd monitor or tv with huge amounts of input lag already like any recent samsung tv/100hz sony tv or a monitor like a dell 2408
That's odd, because Left 4 Dead is the only game I have where I can enable VSync without noticing any mouse lag. In every other game, vsync causes a ton of mouse lag, specifically because they don't support triple buffering like L4D does natively.
 

Slavik81

Member
brain_stew said:
In this regard, yeah, he absolutely is. There is zero reason to leave a triple buffering setting out of any modern PC game; there's just no solid argument for it at all. It's an incredibly small amount of work that can see the framerate of many of your end users increase by up to 50%. If a tiny bit of "optimisation" with benefits as large as that can't be justified, then why optimise your game engine at all? Gamers shouldn't have to resort to workarounds. On the console side, sure, things are different given the anemic RAM to play with, but I'll be damned if it's not a better solution than the clusterfuck of shit that is constant tearing, which we see in far too many modern games.
Most people don't notice it.

When I was playing Assassin's Creed on PS3, I sat there flicking the camera back and forth trying to show my friends the screen tearing. While it was fairly obvious to me, they never saw what I was talking about.
 
bee said:
i don't need an article to tell me what i've known for 10 years, triple buffer is lag city in fps games, easiest way to show it is go and play left4dead with vsync disabled then enabled then triple buffered and it's clear as day

well unless you play on some shitty lcd monitor or tv with huge amounts of input lag already like any recent samsung tv/100hz sony tv or a monitor like a dell 2408

If you're super sensitive to input lag, I don't know how you can see the option of triple buffering as a bad thing. If you don't mind tearing, then fair enough, don't use it; but if you're playing a game where tearing is particularly bad, at least triple buffering gives you an option to get rid of it with reduced input latency compared to the other alternative of standard double-buffered v-sync.

I'm not trying to claim it's the perfect solution for every case and every gamer, though I do believe it is actually the best setting for most gamers and is absolutely an improvement over double-buffered v-sync in almost every case, so long as you can spare an extra 10-20MB of VRAM.

If input lag is an issue for you, the first thing you should be doing is changing the "maximum pre-rendered frames" setting in your drivers to 0 or 1. It's very likely that triple buffering plus zero pre-rendered frames will have less input lag than standard double buffering with the default setting of 3/4.
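As a rough sense of the scale involved (toy numbers again, assuming the driver's render-ahead queue actually fills when you're GPU-bound, so each queued frame adds about one frame time of delay):

```cpp
#include <cstdio>

int main() {
    // Toy model: a render-ahead queue of n CPU-prepared frames adds
    // roughly n extra frame times of input latency when GPU-bound.
    const double frameMs = 1000.0 / 60.0;
    for (int n : {0, 1, 3}) // common "maximum pre-rendered frames" values
        printf("queue depth %d: ~%2.0f ms of extra latency at 60 fps\n",
               n, n * frameMs);
}
```

Dropping the queue from 3 to 1 saves around 33 ms at 60 fps, twice the worst-case cost of v-sync itself.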
 
Slavik81 said:
Most people don't notice it.

When I was playing Assassin's Creed on PS3, I sat there flicking the camera back and forth trying to show my friends the screen tearing. While it was fairly obvious to me, they never saw what I was talking about.

I find that really hard to believe; I think it's more that they ignore it and deal with it. We're talking about a situation where someone is blocking out 50% of the visual information fed to them. How in the hell can you play a video game if you only react to or notice 50% of the visual data on screen? It's the equivalent of playing a game with your eyes squinted.
 

DaveKap

Banned
Has there ever been a full layman's terms explanation for all those little switches you can flip on and off in an nVidia or ATi control panel? I already had triple buffering enabled for all games, but I really wonder what some of these other features will give me in terms of performance vs quality.
 
brain_stew said:
I find that really hard to believe; I think it's more that they ignore it and deal with it. We're talking about a situation where someone is blocking out 50% of the visual information fed to them. How in the hell can you play a video game if you only react to or notice 50% of the visual data on screen? It's the equivalent of playing a game with your eyes squinted.
I think everyone can see it if you show them what it is. I don't think they "deal" with it, as in learn to ignore it, though.

It just doesn't mean anything to them. I have pointed screen tearing out to many people and normally they just say I'm nit-picking. They see it, but it means nothing to them.
 

Truespeed

Member
brain_stew said:
In this regard, yeah, he absolutely is. There is zero reason to leave a triple buffering setting out of any modern PC game; there's just no solid argument for it at all. It's an incredibly small amount of work that can see the framerate of many of your end users increase by up to 50%. If a tiny bit of "optimisation" with benefits as large as that can't be justified, then why optimise your game engine at all? Gamers shouldn't have to resort to workarounds. On the console side, sure, things are different given the anemic RAM to play with, but I'll be damned if it's not a better solution than the clusterfuck of shit that is constant tearing, which we see in far too many modern games.

Red Faction, Ghostbusters and Bionic Commando are just three high-profile recent examples of console games where we're seeing as much as 50% of frames suffering from tearing. Any game where 50% or more of the visual information presented to the player is not in a comprehensible state is not a fit product, it really isn't.

If there's "zero reason" to leave out a triple buffering setting than why doesn't every game engine support it inherently? Do you really think it's a technique that has escaped the grasp of their technical leads? Have you ever asked yourself why the majority of games don't support it? If you really think triple buffering is the solution to tearing then you really need to find out why the technique is basically ignored to find out your answer.
 

maus

Member
MvmntInGrn said:
I think everyone can see it if you show them what it is. I don't think they "deal" with it, as in learn to ignore it, though.

It just doesn't mean anything to them. I have pointed screen tearing out to many people and normally they just say I'm nit-picking. They see it, but it means nothing to them.
yeah but it's pretty much comparable to 30fps vs 60fps. while it might not be at the forefront of their mind, i think they'd be able to tell if they played the two side by side. most console devs typically choose to enable vsync over not having it, if the hardware permits.

also, i was a drunk and surly nerd arguing in this thread last night so i gotta apologize to brain_stew for being an ass and not really comprehending his posts. i was just going off my personal experience and frustration dealing with triple buffering/vsync/input lag, which i haven't fully solved for some games. i just wish more games would let you cap the fps, since tearing isn't very noticeable to me at 60fps and below.
 

bee

Member
well i don't believe triple buffering offers less input lag than normal vsync at all, anandtech can say what they want, i know the delay i feel

in fps games it's a total non-option, if you're fine with sacrificing accuracy then you may as well play with a joypad

as i said before in other gametypes i don't mind it really, although i also don't use it in sim racers either
 
Truespeed said:
If there's "zero reason" to leave out a triple buffering setting than why doesn't every game engine support it inherently? Do you really think it's a technique that has escaped the grasp of their technical leads? Have you ever asked yourself why the majority of games don't support it? If you really think triple buffering is the solution to tearing then you really need to find out why the technique is basically ignored to find out your answer.

You tell me? If a single hobbyist can create a program for free that forces it in over 90% of games, then, yes, it's something that can easily be added to the majority of games.
 
maus said:
yeah but it's pretty much comparable to 30fps vs 60fps. while it might not be at the forefront of their mind, i think they'd be able to tell if they played the two side by side. most console devs typically choose to enable vsync over not having it, if the hardware permits.

also, i was a drunk and surly nerd arguing in this thread last night so i gotta apologize to brain_stew for being an ass and not really comprehending his posts. i was just going off my personal experience and frustration dealing with triple buffering/vsync/input lag, which i haven't fully solved for some games. i just wish more games would let you cap the fps, since tearing isn't very noticeable to me at 60fps and below.

Have you tried altering the prerendered frames setting in your drivers?
 