
Triple buffering: "Why we love it"

Syril

Member
Hmm. Maybe that's why vsync never seems to do anything with me. I always turned on triple buffering because I figured it was some kind of detail setting.
 

Truespeed

Member
brain_stew said:
You tell me? If a single hobbyist can create a program for free that forces it in over 90% of games, then yes, it's something that can easily be added to the majority of games.

The very reason that triple buffering is ignored is the answer. Meta, isn't it? The tech leads have done their analysis and have basically come to the conclusion that the benefits do not overcome its detriment to frame rate and other performance factors. However, it would be nice if they added it as a disabled-by-default option to appease those that are comfortable with reduced frame rates (a phenomenon more people notice than screen tearing).
 

epmode

Member
Truespeed said:
The very reason that triple buffering is ignored is the answer. Meta, isn't it? The tech leads have done their analysis and have basically come to the conclusion that the benefits do not overcome its detriment to frame rate
Wait, wat?
 
Vsync with or without triple buffering has always introduced intolerable amounts of input lag for me, possibly compounded by my Logitech G5 running at 2000 dpi. Just tried D3DOverrider with CS:S and TF2 and had the same results as always. Setting prerender limit to 1 has never reduced lag for me either. (???)
 
Truespeed said:
The very reason that triple buffering is ignored is the answer. Meta, isn't it? The tech leads have done their analysis and have basically come to the conclusion that the benefits do not overcome its detriment to frame rate and other performance factors. However, it would be nice if they added it as a disabled-by-default option to appease those that are comfortable with reduced frame rates (a phenomenon more people notice than screen tearing).

If you're going to follow this line of thinking, then why the hell does any game include a double buffer v-sync option? Nigh on every game out there has an option to enable v-sync in-game, but according to your logic this is surely not possible? Triple buffering is about giving those who want to get rid of tearing better performance and less lag, not less.

The reduction in actual performance between a standard double buffered game and a triple buffered v-synced game is very small; that's the whole freaking point of the method: to enable vertical synchronisation without the performance and input lag cost of bog-standard double buffer v-sync.

What in the hell is wrong with giving the end user options? This is PC gaming, that's what it's all about. Plenty of games don't include antialiasing support (and no, I'm not talking about DX9 games that use deferred rendering, they have an excuse even if it can be overcome) or widescreen support (or at least they never used to) even if their engine is fully capable of it. It's just shitty practice, and nothing to do with tech leads analysing performance factors; it's developers pushing games out the door without basic functionality that could very simply be added but isn't.
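To put some rough numbers on the performance point (these are my own illustrative figures for a 60Hz display, not from the article): with plain double-buffered v-sync, a frame that takes even slightly longer than one refresh has to wait for the next vblank, so the rate snaps down to 30/20/15fps, while triple buffering keeps rendering and only falls to whatever the card can actually manage.

```cpp
#include <cstdio>
#include <initializer_list>

// Illustrative 60Hz comparison only; real games have variable frame times.
int main() {
    const double refreshHz = 60.0;
    const double interval = 1000.0 / refreshHz;          // ~16.7 ms per refresh
    for (double renderMs : {14.0, 18.0, 25.0, 40.0}) {
        // Double buffer + v-sync: the finished frame waits for the next vblank,
        // so the effective rate snaps to 60, 30, 20, 15... fps.
        int refreshesPerFrame = static_cast<int>(renderMs / interval) + 1;
        double dbVsyncFps = refreshHz / refreshesPerFrame;
        // Triple buffering: rendering never stalls, so new frames appear at the
        // render rate (capped at the refresh rate) with no tearing.
        double tripleFps = (renderMs < interval) ? refreshHz : 1000.0 / renderMs;
        std::printf("render %2.0f ms -> double buffer v-sync %4.1f fps, "
                    "triple buffering %4.1f fps\n",
                    renderMs, dbVsyncFps, tripleFps);
    }
    return 0;
}
```

The gap between those two columns is the whole reason the option matters: the cost of getting rid of tearing the double-buffered way is the snap to 30fps, not the extra buffer.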
 

dLMN8R

Member
Truespeed said:
If your double buffered game is rendering at 60fps what impact do you think rendering a third buffer would have?
Jesus....of all the times in this thread you've proven that you don't know a damn thing about what you're talking about, this one takes the cake :lol :lol
 
Truespeed said:
If your double buffered game is rendering at 60fps what impact do you think rendering a third buffer would have?

How in the hell can you guarantee your game runs at 60fps all the time on an open platform? The answer? You can't. What do PC games use to cap their maximum framerate? Yup, it's double buffer v-sync, so why not give users the option to use triple buffering so any dips are much less noticeable?

How many console games are actually a locked 60fps these days? It's certainly less than 5%, so it's hardly an example to set a rule by now, is it?
 

Grayman

Member
Truespeed said:
If your double buffered game is rendering at 60fps what impact do you think rendering a third buffer would have?
It shouldn't have a large effect. My understanding from the article was that your frames hit the two back buffers much as in regular rendering; there are no "extra" frames being rendered here. As long as render time is under ~16ms, you are not losing performance.
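To make that concrete, here's a minimal sketch of the bookkeeping as I understand it from the article (the names and structure are purely illustrative, not any real graphics API):

```cpp
#include <array>
#include <utility>

struct FrameBuffer { /* pixel storage would live here */ };

// Illustrative triple-buffer bookkeeping: one buffer on screen, one holding the
// newest completed frame, one being rendered into. Nothing "extra" is rendered.
struct TripleBuffer {
    std::array<FrameBuffer, 3> buffers{};
    int front = 0;        // buffer currently being scanned out to the display
    int completed = 1;    // most recently finished frame, waiting for a vblank
    int drawing = 2;      // buffer the renderer is filling right now
    bool haveNewFrame = false;

    // The renderer finished a frame. It never has to block: it simply starts
    // over in whichever buffer is neither on screen nor the newest frame.
    void frameFinished() {
        std::swap(completed, drawing);
        haveNewFrame = true;
    }

    // The display hit vertical blank. Flip to the newest completed frame, if
    // there is one. The flip only happens in the blanking interval, so no tear.
    void onVBlank() {
        if (haveNewFrame) {
            std::swap(front, completed);
            haveNewFrame = false;
        }
    }
};

int main() {
    TripleBuffer tb;
    tb.frameFinished();  // renderer completes a frame
    tb.onVBlank();       // display flips to it at the next refresh
}
```

Each side just swaps an index, so the render loop never waits and no additional frames are produced.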
 

dLMN8R

Member
Grayman said:
It shouldn't have a large effect. My understanding from the article was that your frames hit the two back buffers much as in regular rendering; there are no "extra" frames being rendered here. As long as render time is under ~16ms, you are not losing performance.
The difference between double and triple buffering has absolutely NO inherent effect on performance. It does not tax the video card any more to "double" or "triple" buffer a game. The only difference whatsoever is roughly 30MB of video memory or so, enough to store that extra frame buffer. On a video card with 512MB or more, that's negligible.


But then...if people actually read the Anandtech article...everyone would know this already.
 
Grayman said:
It shouldn't have a large effect. My understanding from the article was that your frames hit the two back buffers much as in regular rendering; there are no "extra" frames being rendered here. As long as render time is under ~16ms, you are not losing performance.

This is true. Honestly, I don't know why performance is being brought into this as a negative; the whole freaking point of using triple buffering is to increase performance over standard double buffer v-sync. Performance will be just about on par with standard non-v-synced double buffer rendering, even if FRAPS doesn't report as much (hint: it's not designed to accurately measure the framerate of a triple buffered game).
 
dLMN8R said:
The difference between double and triple buffering has absolutely NO inherent effect on performance. It does not tax the video card any more to "double" or "triple" buffer a game. The only difference whatsoever is roughly 30MB of video memory or so, enough to store that extra frame buffer. On a video card with 512MB or more, that's negligible.


But then...if people actually read the Anandtech article...everyone would know this already.

That 30MB figure is actually on the high side of the estimate, and is taking into account high PC resolutions and IQ. In a console environment at 720p / no AA, the hit would be less than 10MB iirc. Sure, every last stash of RAM is precious in console development, but sacrificing a single-digit amount of MB to get rid of tearing without taking any performance hit sounds fine to me.
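Back-of-the-envelope, assuming a plain 32-bit colour buffer with no AA or padding (my own numbers; the exact figure depends on the format the game actually uses, which is presumably where the 30MB PC estimate comes from):

```cpp
#include <cstdio>

int main() {
    // Rough size of one extra colour buffer, assuming 4 bytes per pixel and no
    // AA or alignment padding; illustrative only.
    const double bytesPerPixel = 4.0;
    const double mb = 1024.0 * 1024.0;
    const double consoleBuffer = 1280.0 * 720.0 * bytesPerPixel;   // 720p
    const double pcBuffer = 2560.0 * 1600.0 * bytesPerPixel;       // 30" panel
    std::printf("extra buffer at 1280x720:  %.1f MB\n", consoleBuffer / mb); // ~3.5
    std::printf("extra buffer at 2560x1600: %.1f MB\n", pcBuffer / mb);      // ~15.6
    return 0;
}
```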
 

Truespeed

Member
brain_stew said:
If you're going to follow this line of thinking, then why the hell does any game include a double buffer v-sync option? Nigh on every game out there has an option to enable v-sync in-game, but according to your logic this is surely not possible? Triple buffering is about giving those who want to get rid of tearing better performance and less lag, not less.

The reduction in actual performance between a standard double buffered game and a triple buffered v-synced game is very small; that's the whole freaking point of the method: to enable vertical synchronisation without the performance and input lag cost of bog-standard double buffer v-sync.

What in the hell is wrong with giving the end user options? This is PC gaming, that's what it's all about. Plenty of games don't include antialiasing support (and no, I'm not talking about DX9 games that use deferred rendering, they have an excuse even if it can be overcome) or widescreen support (or at least they never used to) even if their engine is fully capable of it. It's just shitty practice, and nothing to do with tech leads analysing performance factors; it's developers pushing games out the door without basic functionality that could very simply be added but isn't.

I agree totally with the point of giving your users options to tweak the rendering of the game to a level that's acceptable to them. Giving the user options is never a bad thing. The issue I have is that if the process was such a proven solution to alleviate screen tearing, then we would have seen this technique used in all games. Developers want their games to look the best they can possibly look. Do you think they're actually happy with shipping a game that has screen tearing, or do you think they just don't care about it and are too lazy to come up with a solution? The only solution to screen tearing is to not render the new screen until the previous is fully displayed. Depending on the complexity of what they're trying to render, this will reduce the frame rate and introduce lag. And the lag is considerably more perceivable than screen tearing, which I maintain is not perceivable to the majority of people. It's one or the other, and the technical leads have obviously favored blasting out as many frames as they can regardless of vsync.
 

Grayman

Member
I was going to play around with this myself, but I hardly notice tearing; I think I've seen it in Uncharted once or twice when trying to make it happen. I was going to do the lag test, but I could not even get Quake 4 to double buffer (or go below 60fps at all when I had vsync on) and did not feel like playing the game-settings carousel any more. At 60Hz, would 125fps tear at the top or bottom?
 
Truespeed said:
I agree totally with the point of giving your users options to tweak the rendering of the game to a level that's acceptable to them. Giving the user options is never a bad thing. The issue I have is that if the process was such a proven solution to alleviate screen tearing, then we would have seen this technique used in all games. Developers want their games to look the best they can possibly look. Do you think they're actually happy with shipping a game that has screen tearing, or do you think they just don't care about it and are too lazy to come up with a solution? The only solution to screen tearing is to not render the new screen until the previous is fully displayed. Depending on the complexity of what they're trying to render, this will reduce the frame rate and introduce lag. And the lag is considerably more perceivable than screen tearing, which I maintain is not perceivable to the majority of people. It's one or the other, and the technical leads have obviously favored blasting out as many frames as they can regardless of vsync.

It completely removes screen tearing and gives better performance than the only other alternative. How much more proven does the technique need to be?

Antialiasing has been proven to minimise aliasing, yet plenty of games still ship today without the option. Does that mean that antialiasing techniques aren't proven to reduce aliasing?

Come on man, your logic is all out of whack. Yes, developers can fuck up, and yes, it's completely possible for the editor of the most respected technology site on the internet to have a better grasp of rendering technology than your average game programmer.

I still don't think you've grasped why the fuck me and others are fans of triple buffering yet. There are two solutions to eliminate tearing in games. So long as you can spare 10-20MB or so, triple buffering is unquestionably the better solution; there's no debate here, it's a simple fact proven in real tests and backed up with mathematics.

Developers currently only choose to implement the provably inferior method of eliminating tearing; this is a shitty situation and should be changed. I don't know what's to debate, honestly.
 

Truespeed

Member
brain_stew said:
How in the hell can you guarantee your game runs at 60fps all the time on an open platform? The answer? You can't. What do PC games use to cap their maximum framerate? Yup, it's double buffer v-sync, so why not give users the option to use triple buffering so any dips are much less noticeable?

How many console games are actually a locked 60fps these days? It's certainly less than 5%, so it's hardly an example to set a rule by now, is it?

By specifying your PC game has a minimum specification of CPU and GPU performance, like most games do nowadays? As for how many console games are locked at 60fps, well, not many, because the sacrifices you need to make to achieve 60fps are considerable unless, of course, you have the coding prowess of a company like Criterion Games that understands that art assets, geometry and code need to be architected to support the target framerate. But bringing consoles into this is rather moot, because given the current RAM footprint frustrations there is no way they would even contemplate introducing a third buffer to eat spare memory they don't have.
 

epmode

Member
Truespeed said:
If your double buffered game is rendering at 60fps what impact do you think rendering a third buffer would have?
The entire point of the option is to INCREASE framerate over standard vsync with no tearing. And it works. Jesus Christ.

BTW, I immediately catch on to screen tearing, yet I can easily deal with any additional input lag caused by triple buffering in 90% of everything I've tried.
 
Grayman said:
I was going to play around with this myself, but I hardly notice tearing; I think I've seen it in Uncharted once or twice when trying to make it happen. I was going to do the lag test, but I could not even get Quake 4 to double buffer (or go below 60fps at all when I had vsync on) and did not feel like playing the game-settings carousel any more. At 60Hz, would 125fps tear at the top or bottom?

If you don't care about tearing at all, then triple buffering offers nothing to you; I'd never claim it does. I happen to think tearing is the worst visual artefact in modern gaming, so that's why triple buffering support is a big deal to me. I shouldn't have to put up with extra lag and a worse framerate because someone forgot to spend an extra half an hour coding an option into their game's menu.
 
Truespeed said:
By specifying your PC game has a minimum specification of CPU and GPU performance, like most games do nowadays? As for how many console games are locked at 60fps, well, not many, because the sacrifices you need to make to achieve 60fps are considerable unless, of course, you have the coding prowess of a company like Criterion Games that understands that art assets, geometry and code need to be architected to support the target framerate. But bringing consoles into this is rather moot, because given the current RAM footprint frustrations there is no way they would even contemplate introducing a third buffer to eat spare memory they don't have.
I thought Uncharted 2 was using Triple Buffering?

Of course it is in the vast minority though. :lol
 
Truespeed said:
By specifying your PC game has a minimum specification of CPU and GPU performance, like most games do nowadays? As for how many console games are locked at 60fps, well, not many, because the sacrifices you need to make to achieve 60fps are considerable unless, of course, you have the coding prowess of a company like Criterion Games that understands that art assets, geometry and code need to be architected to support the target framerate. But bringing consoles into this is rather moot, because given the current RAM footprint frustrations there is no way they would even contemplate introducing a third buffer to eat spare memory they don't have.

Oh, come the fuck on, those specifications have meant sweet FA for well over a decade now and you know it. They sure as hell don't mean you'll have a guaranteed 60fps in all situations, that's for damn sure. Even then, it's impossible to test every hardware combination, and in order to ensure what you want that's exactly what you'd have to do, because the bottlenecks will be different in every system available on the market.

Then why did Capcom do just that with RE5, and confirm as much in post-release interviews? Did anyone mention increased input lag in the PS3 version of RE5? Absolutely not, so that's a test case that proves that any extra lag from triple buffering is often transparent to the end user. Apparently Naughty Dog are doing the same with Uncharted (unconfirmed). We're talking a single-digit amount of RAM here with console resolutions and IQ, i.e. less than 2% of total RAM; that is nowhere near as big a compromise as you're making it out to be. Heck, many 360 developers, including Bungie, were losing as much as 3-5x that with old dev kits that only had 512MB of RAM and thus no extra space to store debug data.
 

Truespeed

Member
MvmntInGrn said:
I thought Uncharted 2 was using Triple Buffering?

Of course it is in the vast minority though. :lol

Well, a citation would be neat, but if anyone could do it, it's ND. The one criticism that detractors would always use to put down Uncharted 1 was the screen tearing. I played through the game and never really noticed it, though.
 

Truespeed

Member
epmode said:
The entire point of the option is to INCREASE framerate over standard vsync with no tearing. And it works. Jesus Christ.

BTW, I immediately catch on to screen tearing, yet I can easily deal with any additional input lag caused by triple buffering in 90% of everything I've tried.

Great. Have you given any thought to sharing your discovery with engine developers? Do you think you've stumbled upon something they have no knowledge or idea of? Why do you think they haven't implemented this and why this technique isn't pervasive in all games?
 
Truespeed said:
Well, a citation would be neat, but if anyone could do it, it's ND. The one criticism that detractors would always use to put down Uncharted 1 was the screen tearing. I played through the game and never really noticed it, though.

The quotes from Capcom are in Japanese, but some guys at Beyond3D translated a lot of it. I can't find the article right now, but the framerate analysis in this thread confirms it:

http://forum.beyond3d.com/showthread.php?p=1302899
 
Truespeed said:
Great. Have you given any thought to sharing your discovery with engine developers? Do you think you've stumbled upon something they have no knowledge or idea of? Why do you think they haven't implemented this and why this technique isn't pervasive in all games?

Developers make shitty technology decisions all the time. The fact that sub-HD resolutions and unstable 30fps framerates are par for the course this gen should be more than enough to tell you that.

Fuck, Ether Snake was describing to me the other day how many of his fellow graphics programmers didn't even understand the concept and benefits of ambient occlusion, ffs. I'm a casual end user and even I grasp that; being in a job at a game developer doesn't mean you have a perfect understanding and grasp of every single rendering technique, nor should it. The fact that the two most technically competent console development houses around seem to be using triple buffering in their PS3 work should tell you that it's a technique that has merit.
 

maus

Member
I just tried some TF2 with triple buffering and the input lag was maddening. The Source engine seems to be worse than most other engines with this type of thing; the original Half-Life is not nearly as bad.

At least it lets me cap the fps in the console, because it tears like crazy when you start getting above 60fps.
 

Truespeed

Member
maus said:
I just tried some TF2 with triple buffering and the input lag was maddening. The Source engine seems to be worse than most other engines with this type of thing; the original Half-Life is not nearly as bad.

At least it lets me cap the fps in the console, because it tears like crazy when you start getting above 60fps.

Haven't you been reading this thread? Triple buffering is supposed to increase your framerate :lol
 
Truespeed said:
Haven't you been reading this thread? Triple buffering is supposed to increase your framerate :lol

Evidently you haven't. Fuck, I give up, if you don't have basic reading comprehension then I'm just wasting my time.


Edit: God damn it, why didn't I check this guy's post history before getting into this! :lol If I'd have known he was part of the PS3/KZ2 crazy brigade I would have known not to bother. Any fucker that comes out with this gem deserves to be laughed at in a public setting, and all I was trying to do was have a reasoned technical debate as well as provide help to GAF's PC gamers.

Truespeed said:
While we're at it - I think it's time to re-open the investigation in all 19 of those PS3 / 360 face-off's. I clearly think there was manipulation, payola and major bias involved.
 

Truespeed

Member
brain_stew said:
Developers make shitty technology decisions all the time. The fact that sub-HD resolutions and unstable 30fps framerates are par for the course this gen should be more than enough to tell you that.

Fuck, Ether Snake was describing to me the other day how many of his fellow graphics programmers didn't even understand the concept and benefits of ambient occlusion, ffs. I'm a casual end user and even I grasp that; being in a job at a game developer doesn't mean you have a perfect understanding and grasp of every single rendering technique, nor should it. The fact that the two most technically competent console development houses around seem to be using triple buffering in their PS3 work should tell you that it's a technique that has merit.

Agreed, but I consider the Naughty Dog engine to be one of the best, if not the best, engine for the PS3, and people picked apart Uncharted for its alleged screen tearing. Did ND make bad decisions at the time? I really don't think so. They did the best they could at the time. If the ND 2 engine is indeed using triple buffering, in a PS3 game no less, to alleviate screen tearing, then my admiration for their technical prowess will have increased dramatically. I still consider this unsubstantiated, but triple buffering in a game as complex as Uncharted 2 is major when you consider developers are constantly slamming the PS3 for its OS-constrained RAM footprint.
 

epmode

Member
Truespeed said:
Great. Have you given any thought to sharing your discovery with engine developers? Do you think you've stumbled upon something they have no knowledge or idea of? Why do you think they haven't implemented this and why this technique isn't pervasive in all games?
Man, if only I could show you the difference in Trine on my computer. It's entirely reproducible, too.

The framerate difference is blindingly obvious. I couldn't care less about why it's not in Direct3D by default.

edit: Actually, I COULD care less about D3D defaults. I COULDN'T care less about console developers and their use of the technology.
 

Truespeed

Member
brain_stew said:
Evidently you haven't. Fuck, I give up, if you don't have basic reading comprehension then I'm just wasting my time.

Dude, don't take it personally. I just disagree with you. Deal with it and try not to make it personal. Now, build me a system for $500 that gets 60fps in Burnout.
 
epmode said:
Man, if only I could show you the difference in Trine on my computer. It's entirely reproducible, too.

Mate, he's part of the PlayStation crazy brigade; just check out his post history. Arguing with reasoned, technically and mathematically backed-up factual evidence won't do shit for these guys. Don't bother, I've already wasted too much time on him.
 
Truespeed said:
Dude, don't take it personally. I just disagree with you. Deal with it and try not to make it personal. Now, build me a system for $500 that gets 60fps in Burnout.

No, you disagree with well-researched and mathematically proven evidence. There's a difference here. And, um, my $500 config will do that (probably at 1080p even) just fine.
 

Truespeed

Member
brain_stew said:
Evidently you haven't. Fuck, I give up, if you don't have basic reading comprehension then I'm just wasting my time.


Edit: God damn it, why didn't I check this guy's post history before getting into this! :lol If I'd have known he was part of the PS3/KZ2 crazy brigade I would have known not to bother. Any fucker that comes out with this gem deserves to be laughed at in a public setting, and all I was trying to do was have a reasoned technical debate as well as provide help to GAF's PC gamers.

Apparently, your humor comprehension isn't very good either. You really are an idiot for not being able to pick that up considering the context it was presented in.
 

Truespeed

Member
brain_stew said:
No, you disagree with well-researched and mathematically proven evidence. There's a difference here. And, um, my $500 config will do that (probably at 1080p even) just fine.

This has become silly. If you really think triple buffering is the panacea for screen tearing, then more power to you. We'll just have to disagree.

By the way, since you're researching me, did you also come across this little gem?

Truespeed said:
This is fucking bullshit. I just bought MT Framework 1.0 and now they're releasing MT Framework 2.0 so soon? This is the last time I buy a framework from Capcom. They're fucking bifurcating the community with this money-grabbing stunt. I'll post a link to an online protest page when I create it so that we can send Capcom a message.

See what I did there?

On second thought, probably not.
 

epmode

Member
Truespeed said:
This has become silly. If you really think triple buffering is the panacea for screen tearing, then more power to you. We'll just have to disagree.
But... but there's not a question here. You can actually see the difference between regular vsync and vsync+triple buffering. This isn't some hypothetical situation and you sure as shit don't need a framerate counter.

But honestly, the technology is going to do very little for someone who doesn't notice tearing in Uncharted, which is unfortunately very obvious to me. You probably don't even care about vsync in the first place, while I enable it in almost every game I can.
 

Slavik81

Member
brain_stew said:
I find that really hard to believe; I think it's more that they ignore it and deal with it. We're talking about a situation where someone is blocking out 50% of the visual information fed to them. How in the hell can you play a video game if you only react upon or notice 50% of the visual data on screen? It's the equivalent of playing a game with your eyes squinted.
Two reasons why it's not immediately apparent:

1. Unlike what you suggest, 50% of the visual information is not missing. If the undrawn parts were black or something, it would be much more obvious, but just having the old frame in its place is not nearly as noticeable. How noticeable it is depends on where the break is (if it's at the edges of the screen, it will be less noticeable), and depends on how much movement there was on things that intersect the break line.

2. It may only be on screen for a single frame. Picking out minor discrepancies that appear for only 17 to 33 milliseconds is hard. Unless I sit there and try to look for it, all I see is a flicker.

I'm sure that with training, anyone could see it. And perhaps they are also unconsciously affected by it. But I've tried to show obvious tearing to people before, and have had great difficulty getting them to see it. I think that's common.
 

Grayman

Member
I really need to change my avatar so I am not associated with the Killzone crazy club; I haven't even played the game in months.
 
Slavik81 said:
Two reasons why it's not immediately apparent:

1. Unlike what you suggest, 50% of the visual information is not missing. If the undrawn parts were black or something, it would be much more obvious, but just having the old frame in its place is not nearly as noticeable. How noticeable it is depends on where the break is (if it's at the edges of the screen, it will be less noticeable), and depends on how much movement there was on things that intersect the break line.

2. It may only be on screen for a single frame. Picking out minor discrepancies that appear for only 17 to 33 milliseconds is hard. Unless I sit there and try to look for it, all I see is a flicker.

I'm sure that with training, anyone could see it. And perhaps they are also unconsciously affected by it. But I've tried to show obvious tearing to people before, and have had great difficulty getting them to see it. I think that's common.

Well, I'm pointing to recent examples that have been recorded to have around 50% torn frames here, with most of it slap bang in the middle of the screen. Ghostbusters, Red Faction, Sacred 2 and Bionic Commando are all perfect examples. My comment about the visual information being "missing" was indeed totally off, wrong choice of words. I just tend to disregard a frame with bad tearing as garbage data; to me it's a lost frame, because the image is totally incorrect.

When it's just the odd torn frame here and there, and not a high percentage of frames over a period of time, yeah, I can totally understand people missing it, though all recorded data would suggest that they are seeing it; I guess their brain tries to blend it in with the previous and next frame or something. I don't know, I'm just guessing, because it's so clear as day to me that it's hard to grasp how people don't notice it.

I mean, whilst they may not know what's wrong, I struggle to believe anyone could watch one of these two videos and not think something was "off" about the motion.

http://www.eurogamer.net/videos/digitalfoundry-sacred-2-1080p-performance-analysis?size=hd

http://www.eurogamer.net/videos/digitalfoundry-red-faction-guerilla-frame-rate-analysis?size=hd

I mean, those two videos make me feel physically ill; the motion is so warped due to the tearing.

Honestly, I'm quite interested in how others perceive the world around them, and how their brain rationalises stuff like that. The way I perceive these things, it's so very hard to comprehend someone's mind compensating for the inconsistencies, though I'm not going to rule that out as a possibility; it's just hard to come to terms with when you've always seen things one way.


Grayman said:
I really need to change my avatar so I am not associated with the Killzone crazy club; I haven't even played the game in months.

I admit my comments were off there, mind. I've just had some silly run-ins with that bunch in the past, and the refusal of a poster to even read the research presented to him time and again really wore me thin. It was a moment of weakness.
 

Truespeed

Member
epmode said:
But... but there's not a question here. You can actually see the difference between regular vsync and vsync+triple buffering. This isn't some hypothetical situation and you sure as shit don't need a framerate counter.

But honestly, the technology is going to to do very little for someone who doesn't notice tearing in Uncharted, which is unfortunately very obvious to me. You probably don't even care about vsync in the first place while I enable it in almost every game I can.

Actually, there is a question, and it pertains to why the technique hasn't been universally adopted by the best engine developers in the world - or even John Carmack for that matter. The pathetic response I received was to call their technology and decisions 'shitty'. Here's a question: how would triple buffering fare on a Crossfire / SLI setup? I'm guessing not well.
 
Truespeed said:
Actually, there is a question, and it pertains to why the technique hasn't been universally adopted by the best engine developers in the world - or even John Carmack for that matter. The pathetic response I received was to call their technology and decisions 'shitty'. Here's a question: how would triple buffering fare on a Crossfire / SLI setup? I'm guessing not well.

Carmack's engines have always been OpenGL based, and triple buffering has been supported in all OpenGL games through the driver since forever. Pretty sure his engines have had it built in anyway; RTCW sure as hell had a triple buffering option nearly a decade ago.

Read the fucking Anandtech article, for Christ's sake; it explains it all. I'm not making this shit up, you're arguing against provable fact.

Triple buffering won't work with a dual SLI/Crossfire config (for obvious reasons), but since AFR is a fundamentally broken approach to GPU scaling anyway, it's academic.
 

Grayman

Member
I don't think I could even make Quake 4 double buffer. Vsync would screw up the physics in all of them (without community changes) except Quake Live anyway.
 

dLMN8R

Member
It's pretty sad when someone spends more time posting inaccurate BS in a thread than it takes to read the damned article this entire thread is about in the first place.
 

Dural

Member
I was just reading this interview with Criterion where they talk about their technology and one of the questions was if Paradise was triple buffered. Their response was that it was double buffered for minimal lag. So some developers aren't willing to add the lag that is introduced with triple buffering.
 

Ding

Member
Buffering merely trades one issue for another. It's hardly a panacea. Some people may like it, however. It depends on whether you prefer judder and latency, as opposed to tearing.

In "fast" games, I myself can live with some tearing. In slower games, sure, sync that sucker up.
 

epmode

Member
Dural said:
I was just reading this interview with Criterion where they talk about their technology and one of the questions was if Paradise was triple buffered. Their response was that it was double buffered for minimal lag. So some developers aren't willing to add the lag that is introduced with triple buffering.
This is why PCs are so rad. <3 options <3
 
Truespeed said:
The tech leads have done their analysis and have basically come to the conclusion that the benefits do not overcome its detriment to frame rate and other performance factors.

The 'detriment' of enabling triple buffering is basically increased video memory needs. You're acting like developers don't include triple buffering as an option because of some drastic performance or gameplay hit. Yet you obviously ignore all the users that enjoy the benefits of triple buffering, and you even ignored (or couldn't correctly comprehend) the article at hand explaining how it works. Also, regarding your "developers will always do what's best" comment, you have to remember that even some modern PC games leave out basic options to enable/adjust anti-aliasing and support for widescreen resolutions (or proper FOV settings for those resolutions), basically destroying your entire argument that developers will always include options to make their game look the best for each user.
 
The thing is, there really isn't any increased latency. The only time you'll get more recent visual information with standard double buffering is when the frame is tearing, and then only the information below the tear will be more recent. In all other circumstances latency should be the same. Honestly, it really is the best solution in the vast majority of cases; it's really hard to back up an argument against it as far as I'm concerned.
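As a toy illustration of why the lag matches unsynced double buffering (my own bookkeeping with an assumed fixed render time, not a measurement):

```cpp
#include <cmath>
#include <cstdio>

int main() {
    // Toy 60Hz timeline; illustrative only.
    const double refresh = 1000.0 / 60.0;   // ~16.7 ms between vblanks
    const double renderMs = 12.0;           // assumed per-frame render time

    for (int v = 1; v <= 4; ++v) {
        double vblank = v * refresh;
        // Newest frame that had finished rendering by this vblank: this is the
        // frame triple buffering flips to, and also the newest complete frame
        // an unsynced double-buffered setup could be showing above a tear line.
        double lastDone = std::floor(vblank / renderMs) * renderMs;
        std::printf("vblank %d at %5.1f ms shows the frame finished at %5.1f ms "
                    "(%.1f ms old)\n", v, vblank, lastDone, vblank - lastDone);
    }
    return 0;
}
```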

Please read the article, and in particular the comments beneath it; this is all explained very well.
 

Blizzard

Banned
dLMN8R said:
It's pretty sad when someone spends more time posting inaccurate BS in a thread than it takes to read the damned article this entire thread is about in the first place.
The thing is, he claimed to have read it, or at least he looked at it enough to say that the comments posted to it show that it's inaccurate or more complicated or whatever.
 