brain_stew said:
Have you tried altering the prerendered frames setting in your drivers?

maus said:
yeah, it's been sitting at 1 for a while now.

brain_stew said:
You tell me? If a single hobbyist can create a program for free that forces it in over 90% of games, then, yes, it's something that can easily be added to the majority of games.
Truespeed said:
The very reason that triple buffering is ignored is the answer. Meta, isn't it? The tech leads have done their analysis and have basically come to the conclusion that the benefits do not overcome its detriment to frame rate and other performance factors. However, it would be nice if they added it as a disabled-by-default option to appease those that are comfortable with reduced frame rates (a phenomenon more people notice than screen tearing).

epmode said:
Wait, wat?
Truespeed said:
If your double buffered game is rendering at 60fps what impact do you think rendering a third buffer would have?

Jesus....of all the times in this thread you've proven that you don't know a damn thing about what you're talking about, this one takes the cake :lol :lol
Grayman said:
It shouldn't have a large effect. My understanding from the article was that your frames hit the two back buffers much as in regular rendering; there are no "extra" frames being rendered here, so as long as render time is <16ms you are not losing performance.

dLMN8R said:
The difference between double and triple buffering has absolutely NO inherent effect on performance. It does not tax the video card any more to "double" or "triple" buffer a game. The only difference whatsoever is roughly 30MB of video memory or so, enough to store that extra frame buffer. On a video card with 512MB or more, that's negligible.

But then...if people actually read the AnandTech article...everyone would know this already.
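The memory cost dLMN8R mentions is easy to sanity-check with back-of-the-envelope arithmetic. A rough sketch in Python (buffer_mb is a made-up helper name, and real drivers pad and align surfaces, so treat these as ballpark figures):

    # Rough framebuffer arithmetic: bytes = width * height * bytes per pixel.
    def buffer_mb(width, height, bytes_per_pixel=4):  # 4 bytes = 32-bit RGBA
        return width * height * bytes_per_pixel / (1024 * 1024)

    for w, h in [(1280, 720), (1920, 1080), (2560, 1600)]:
        print(f"{w}x{h}: {buffer_mb(w, h):.1f} MB per buffer")

That's roughly 3.5MB, 7.9MB and 15.6MB per buffer respectively, so a single extra 32-bit back buffer is comfortably under the ~30MB quoted; higher-precision or antialiased surfaces are what push the cost up toward that figure.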
brain_stew said:
If you're going to follow this line of thinking, then why the hell does any game include a double-buffered v-sync option? Nigh on every game out there has an option to enable v-sync ingame, but according to your logic this is surely not possible? Triple buffering is about giving better performance and less lag to those that want to get rid of tearing, not worse.

The reduction in actual performance between a standard double buffered game and a triple buffered v-synced game is very small; that's the whole freaking point of the method: to enable vertical synchronisation without the heavy performance and input lag cost of bog-standard double-buffered v-sync.

What in the hell is wrong with giving the end user options? This is PC gaming, that's what it's all about. Plenty of games don't include antialiasing support (and no, I'm not talking about DX9 games that use deferred rendering; they have an excuse, even if it can be overcome) or widescreen support (or at least they never used to) even if their engine is fully capable of it. It's just shitty practice, and nothing to do with tech leads analysing performance factors; it's developers pushing games out the door without basic functionality that could very simply be added.
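What brain_stew is describing is the page-flip flavour of triple buffering covered in the AnandTech article. A minimal sketch of the bookkeeping, assuming three buffers and with all names invented for illustration:

    # Page-flip triple buffering, reduced to its bookkeeping. The renderer
    # always has a free buffer to draw into, so it never stalls waiting for
    # vblank; each vblank flips to the newest *completed* frame and any
    # older unshown frame is simply dropped.
    front = 0        # buffer being scanned out to the display
    drawing = 1      # buffer the renderer is currently filling
    ready = None     # most recently completed, not yet displayed buffer

    def frame_finished():
        global ready, drawing
        ready = drawing
        # draw into whichever buffer is neither on screen nor pending
        drawing = ({0, 1, 2} - {front, ready}).pop()

    def vblank():
        global front, ready
        if ready is not None:  # flip only when a newer frame exists
            front, ready = ready, None

Because the flip always takes the newest completed frame, the card runs flat out with no tearing, which is exactly why this differs from a render-ahead queue that displays every frame in order and adds lag.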
Truespeed said:
I agree totally with the point of giving your users options to tweak the rendering of the game to a level that's acceptable to them. Giving the user options is never a bad thing. The issue I have is that if the process was such a proven solution to alleviate screen tearing then we would have seen this technique used in all games. Developers want their games to look the best they can possibly look. Do you think they're actually happy with shipping a game that has screen tearing, or do you think they just don't care about it and are too lazy to come up with a solution? The only solution to screen tearing is to not render the new screen until the previous is fully displayed. Depending on the complexity of what they're trying to render, this will reduce the frame rate and introduce lag. And the lag is considerably more perceptible than screen tearing, which I maintain is not perceptible to the majority of people. It's one or the other, and the technical leads have obviously favored blasting out as many frames as they can regardless of vsync.
brain_stew said:
How in the hell can you guarantee your game runs at 60fps all the time on an open platform? The answer? You can't. What do PC games use to cap their maximum framerate? Yup, it's double-buffered v-sync, so why not give users the option to use triple buffering so any dips are much less noticeable?

How many console games are actually a locked 60fps these days? It's certainly less than 5%, so it's hardly an example to set a rule by, now is it?
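The arithmetic behind "dips are much less noticeable": with double-buffered v-sync, any frame that misses a 60Hz vblank waits for the next one, so the displayed rate snaps to integer divisors of 60. A simplified steady-state model:

    import math

    # With double-buffered v-sync, a frame that isn't ready at the vblank
    # waits for the next one, so each frame occupies a whole number of
    # 60Hz refresh intervals and the displayed rate snaps to 60, 30, 20...
    refresh = 60.0
    for native_fps in (58, 45, 25):
        intervals = math.ceil(refresh / native_fps)  # refreshes per frame
        print(f"{native_fps}fps card -> {refresh / intervals:.0f}fps displayed")

A card rendering at 58fps therefore displays 30fps under plain v-sync, while with triple buffering it keeps rendering into the spare buffer and stays near 58fps with no tearing.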
epmode said:
The entire point of the option is to INCREASE framerate over standard vsync with no tearing. And it works. Jesus Christ.
Grayman said:
I was going to play around with this myself, but I hardly notice tearing; I think I've seen it in Uncharted once or twice when trying to make it happen. I was going to do the lag test, but I could not even get Quake 4 to double buffer (or go below 60fps at all when I had vsync on) and did not feel like playing the game-settings carousel any more. At 60Hz, would 125fps tear at the top or bottom?
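As for the 125fps question: strictly, neither. A rough sketch, assuming swaps happen the instant a frame completes and scanout runs top to bottom at 60Hz:

    # A 60Hz scanout takes 1/60s top to bottom; at 125fps a new frame
    # completes every 1/125s, so swaps land mid-scanout at drifting points.
    refresh_s, frame_s = 1 / 60, 1 / 125
    t = 0.0
    for r in range(3):
        start = r * refresh_s
        while t < start + refresh_s:
            if t > start:  # swap mid-scanout: tear at this screen fraction
                print(f"refresh {r}: tear ~{100 * (t - start) / refresh_s:.0f}% down")
            t += frame_s

You get roughly two tear lines per refresh, and they drift upward from frame to frame because 125 is not an integer multiple of 60.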
Truespeed said:
By specifying your PC game has a minimum specification of CPU and GPU performance, like most games do nowadays? As for how many console games are locked at 60fps, well, not many, because the sacrifices you need to make to achieve 60fps are considerable unless, of course, you have the coding prowess of a company like Criterion Games that understands that art assets, geometry and code need to be architected to support the target framerate. But bringing consoles into this is rather moot, because given the current RAM footprint frustrations there is no way they would even contemplate introducing a third buffer to eat spare memory they don't have.

MvmntInGrn said:
I thought Uncharted 2 was using Triple Buffering?

Of course it is in the vast minority though. :lol
epmode said:
BTW, I immediately catch on to screen tearing, yet I can easily deal with any additional input lag caused by triple buffering in 90% of everything I've tried.
Truespeed said:
Well, a citation would be neat, but if anyone could do it, it's ND. The one criticism that detractors would always use to put down Uncharted 1 was the screen tearing. I played through the game and never really noticed it, though.
Truespeed said:
Great. Have you given any thought to sharing your discovery with engine developers? Do you think you've stumbled upon something they have no knowledge of? Why do you think they haven't implemented this, and why isn't this technique pervasive in all games?
maus said:
I just tried some TF2 with triple buffering and the input lag was maddening. The Source engine seems to be worse than most other engines with this type of thing; the original Half-Life is not nearly as bad.

At least it lets me cap the fps in the console, because it tears like crazy when you start getting above 60fps.
Truespeed said:
Haven't you been reading this thread? Triple buffering is supposed to increase your framerate :lol
Truespeed said:
While we're at it - I think it's time to re-open the investigation into all 19 of those PS3/360 face-offs. I clearly think there was manipulation, payola and major bias involved.
brain_stew said:
Developers make shitty technology decisions all the time. The fact that sub-HD resolutions and unstable 30fps framerates are par for the course this gen should be more than enough to tell you that.

Fuck, Ether Snake was describing to me the other day how many of his fellow graphics programmers didn't even understand the concept and benefits of ambient occlusion, ffs. I'm a casual end user and even I grasp that; being in a job at a game developer doesn't mean you have a perfect understanding and grasp of every single rendering technique, nor should it. The fact that the two most technically competent console development houses around seem to be using triple buffering in their PS3 work should tell you that it's a technique that has merit.
epmode said:
Man, if only I could show you the difference in Trine on my computer. It's entirely reproducible, too.

brain_stew said:
Evidently you haven't. Fuck, I give up, if you don't have basic reading comprehension then I'm just wasting my time.
Truespeed said:
Dude, don't take it personal. I just disagree with you. Deal with it and try not to make it personal. Now, build me a system for $500 that gets 60fps in Burnout.

brain_stew said:
No, you disagree with well-researched and mathematically proven evidence. There's a difference here. And, um, my $500 config will do that (probably at 1080p even) just fine.

Edit: God damn it, why didn't I check this guy's post history before getting into this! :lol If I'd have known he was part of the PS3/KZ2 crazy brigade I would have known not to bother. Any fucker that comes out with this gem deserves to be laughed at in a public setting, and all I was trying to do was have a reasoned technical debate as well as provide help to GAF's PC gamers.
Truespeed said:
This is fucking bullshit. I just bought MT Framework 1.0 and now they're releasing MT Framework 2.0 so soon? This is the last time I buy a framework from Capcom. They're fucking bifurcating the community with this money-grabbing stunt. I'll post a link to an online protest page when I create it so that we can send Capcom a message.
Truespeed said:
This has become silly. If you really think triple buffering is the panacea to screen tearing then more power to you. We'll just have to disagree.

epmode said:
But... but there's not a question here. You can actually see the difference between regular vsync and vsync+triple buffering. This isn't some hypothetical situation and you sure as shit don't need a framerate counter.
brain_stew said:
I find that really hard to believe, I think it's more that they ignore it and deal with it. We're talking about a situation where someone is blocking out 50% of the visual information fed to them. How in the hell can you play a video game if you only react upon or notice 50% of the visual data on screen? It's the equivalent of playing a game with your eyes squinted.

Slavik81 said:
Two reasons why it's not immediately apparent:

1. Unlike what you suggest, 50% of the visual information is not missing. If the undrawn parts were black or something, it would be much more obvious, but just having the old frame in its place is not nearly as noticeable. How noticeable it is depends on where the break is (if it's at the edges of the screen, it will be less noticeable), and on how much movement there was in things that intersect the break line.

2. It may only be on screen for a single frame. Picking out minor discrepancies that appear for only 17 to 33 milliseconds is hard. Unless I sit there and try to look for it, all I see is a flicker.

I'm sure that with training, anyone could see it. And perhaps they are unconsciously affected by it, too. But I've tried to show obvious tearing to people before, and have had great difficulty getting them to see it. I think that's common.
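To put rough numbers on point 2 (a worked example, assuming a 60Hz display and a steady horizontal camera pan; the pan speed is an invented figure):

    # The shear at a tear line is how far the image moved between the two
    # frames meeting there, and it lasts at most one refresh.
    refresh_hz = 60
    pan_speed_px_s = 600                       # fairly fast camera pan
    offset_px = pan_speed_px_s / refresh_hz    # discontinuity across the tear
    print(f"~{offset_px:.0f}px shear, visible for ~{1000 / refresh_hz:.0f}ms")

So a fast pan produces only a ~10px jog on screen for roughly 17ms, which is easy to miss unless you know to look for it.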
Grayman said:
I really need to change my avatar so I am not associated with the Killzone crazy club; I haven't even played the game in months.
epmode said:
But honestly, the technology is going to do very little for someone who doesn't notice tearing in Uncharted, which is unfortunately very obvious to me. You probably don't even care about vsync in the first place, while I enable it in almost every game I can.
Truespeed said:
Actually, there is a question, and it pertains to why the technique hasn't been universally adopted by the best engine developers in the world - or even John Carmack, for that matter. The pathetic response I received was to call their technology and decisions 'shitty'. Here's a question: how would triple buffering fare on a Crossfire/SLI setup? I'm guessing not well.
Dural said:
I was just reading this interview with Criterion where they talk about their technology, and one of the questions was if Paradise was triple buffered. Their response was that it was double buffered, for minimal lag. So some developers aren't willing to add the lag that is introduced with triple buffering.

This is why PCs are so rad. <3 options <3
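Criterion's choice is easier to follow with a crude latency model (a sketch only; real engines also buffer input and simulation, and the "triple buffering" a console developer weighs is usually the render-ahead queue flavour, which does add a frame of lag, unlike the page-flip scheme from the article):

    # Crude worst-case input-to-screen latency, in 60Hz refresh periods,
    # for a game holding 60fps. The frame counts are a simplification of
    # each scheme, not measurements.
    frame_ms = 1000 / 60
    for scheme, frames in [("double buffered + v-sync", 2),
                           ("render-ahead 'triple buffer'", 3),
                           ("page-flip triple buffer", 2)]:
        print(f"{scheme}: up to ~{frames * frame_ms:.0f}ms")

Under that model the render-ahead queue costs roughly one extra refresh of lag over double buffering, which would explain a racing developer avoiding it.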
Truespeed said:
The tech leads have done their analysis and have basically come to the conclusion the benefits do not overcome its detriment to frame rate and other performance factors.

dLMN8R said:
It's pretty sad when someone spends more time posting inaccurate BS in a thread than it takes to read the damned article this entire thread is about in the first place.

The thing is, he claimed to have read it, or at least he looked at it enough to say that the comments posted to it show that it's inaccurate or more complicated or whatever.