borghe said:
and again, this is all 100% bullshit. the video signal generator grabs whatever frame happens to be in the framebuffer; whether the frame has been updated or not is completely irrelevant to the video signal generator. is the GPU drawing the pause screen at 60fps or 30fps? no, it draws the pause screen and then waits for input. no further images are sent to the framebuffer and the same frame is continuously grabbed until it's updated (as an example). you essentially say that the framebuffer can indeed hold the same frame for multiple refreshes for the signal generator to grab, yet insist that it can only do so for divisible numbers of frames. you offer no technical explanation why (because there isn't one)
There isn't one? Are you kidding?
You see, you probably got this wrong because you're thinking of the average framerate over a second. You can obviously have 45 different frames displayed within a second, but each of those frames will last either 1/60 of a second or 1/30 (or 1/20, and so on); in no way can you have a frame lasting 1/45 of a second. If the average framerate over a second is 45, that means 15 frames lasted 1/30 of a second on screen and 30 lasted 1/60.
This means that within that second, part of the animation was 60fps smooth and part was 30fps smooth, and you'll perceive the 30fps bits as noticeable slowdowns.
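The arithmetic above generalizes: on a 60Hz display with v-sync, if every frame lasts either one or two refreshes, the mix follows directly from the average frame count. A minimal Python sketch (the function name is made up for illustration):

```python
# Sketch, assuming a 60 Hz display with v-sync, where every frame stays on
# screen for a whole number of refreshes (1/60 s or 2/60 s).  For an average
# of avg_fps frames in one second, the mix of durations is forced:
def vsync_mix(avg_fps, refresh_hz=60):
    """Return (frames_lasting_two_refreshes, frames_lasting_one_refresh)
    for one second of output, assuming each frame occupies 1 or 2 refreshes."""
    two = refresh_hz - avg_fps       # frames held 2 passes (the "30fps" bits)
    one = 2 * avg_fps - refresh_hz   # frames held 1 pass  (the "60fps" bits)
    assert two + one == avg_fps            # total frames match the average
    assert 2 * two + one == refresh_hz     # all 60 refreshes accounted for
    return two, one

print(vsync_mix(45))  # -> (15, 30): 15 frames at 1/30 s, 30 frames at 1/60 s
```

So "45fps" at 60Hz is necessarily 15 slow frames mixed with 30 fast ones, exactly as described above.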
The electron gun makes 2 passes every 1/30 of a second; each pass takes roughly 0.016 seconds (1/60). If the GPU can deliver 2 different frames within this time limit, then you'll have a 60fps-smooth 2-frame animation. (NOTE that when I say 60fps here, I don't mean you'll have 60 different frames in the entire second; I'm only talking about these 2 frames. What happens during the rest of the second is a different story and can lead to any kind of average fps figure.)
If either of these frames is late, it will have missed the electron gun's pass and will have to wait for the next one. That means it stays on screen for 2 passes: 1/60 x 2 = 2/60, or 1/30 if you prefer, which is exactly how long a frame remains on screen in a 30fps animation.
So, below 60fps you can only have a 30fps-smooth animation (or less), not something in between. Again, I'm not talking about how many different frames will actually be displayed in a second; I'm talking about the smoothness of this 2-frame sequence.
I think it's obvious now why it has to be 60/2 and not 60/1.5. We're talking about how long these frames have to stay on screen; if it's longer than 1/60, then it has to be 1/30, because that's the soonest available electron gun pass.
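The "wait for the next pass" rule amounts to rounding the render time up to a whole number of refresh periods. A small Python sketch of that quantization (names are illustrative, not any real API):

```python
import math

# Sketch, assuming v-sync on a 60 Hz display: a frame that misses a refresh
# waits for the next one, so its on-screen time is its render time rounded
# UP to a whole number of electron-gun passes.
REFRESH = 1 / 60  # duration of one pass at 60 Hz

def on_screen_time(render_time):
    """How long a frame stays on screen: an integer number of passes."""
    passes = max(1, math.ceil(render_time / REFRESH))
    return passes * REFRESH

# A frame rendered in 1/45 s (~22 ms) misses the first pass and stays on
# screen for two passes, i.e. 1/30 s -- never 1/45 s.
print(round(on_screen_time(1 / 45), 4))  # -> 0.0333 (= 1/30 s)
```

Because `passes` can only be 1, 2, 3, ..., the possible on-screen durations are 1/60, 1/30, 1/20, and so on; 1/45 simply isn't in the set.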
In this case, 2 represents the number of electron gun passes between screen updates; obviously it can't be 1.5 passes, because that would mean swapping in the second frame after the gun has done one and a half passes: the frame would change while the gun is already halfway down the screen = tearing.
Either a new frame is displayed when the gun starts painting the screen, or it isn't. Either it takes 1 pass to be drawn, or it takes 2 or more; it can't take one and a half passes, I'm sure you'll agree. That's why you can either have a 60fps-smooth animation or a 30 (20, 15, etc.) fps one.
Again, notice that I'm using "frames per second", but this doesn't mean the refresh rate must be the same for the entire second. Just accept that within a second, the animation can either be 60fps smooth or 30fps smooth (or 20, or 10, etc.). Of course you can also have 45 different frames in a second, but that would only mean 30 frames were drawn at 60fps and 15 at 30fps; and your eye won't perceive that as a 45fps-smooth animation, it will perceive the sequence for what it is: bits running at 60fps and bits running at 30.
now mind you I am not arguing here that 30 or 60fps don't look best, and I am not arguing that something like 45fps wouldn't have mild jerky issues on pans, like 24fps-encoded DVDs do when pulled down to NTSC. What I AM arguing is the utterly moronic logic that a video GPU is only able to render frames in evenly divisible quantities based on what the scan rate is
But of course not! That's not what I'm saying at all! A GPU can do whatever it's able to do; it can render at 117 frames per second, or 34, or 76, or 13. Scan rate doesn't directly affect GPU rendering times, of course, BUT if you want your game not to look jerky, with serious tearing issues, etc., you will have to somehow synchronize the GPU output with the TV's scan rate.
If this were true there would be a fairly dramatic difference in framerate on my PC, using an fps counter, when switching from say 85Hz to 100Hz... and there isn't. I am still rendering at the same framerates in a game, the only difference being the stability of the screen flicker.
Of course there's no difference in rendering times, but there should be a difference in the way the frames are displayed.
But this retarded stance that you have to have evenly divisible frames makes no sense. There is no difference between running at 60fps, 30fps, 49fps, or 17fps. I agree completely that running at evenly divisible framerates looks best, but from a technical standpoint it is required nowhere to be the case, and arguably in many cases IS NOT the case, which is what leads to framerate problems.
I think after the explanation you'll agree it's not retarded... it's just simple logic.
And it does indeed lead to framerate problems. If the GPU outputs 54 frames in a second and v-sync is enabled, 6 of those 54 frames will be displayed as a 30fps animation and the remaining 48 as a 60fps animation. And you don't want that.
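The 54-frame figure checks out with the same pigeonhole arithmetic (illustrative Python, assuming v-sync on a 60Hz display):

```python
# Quick check of the 54 fps example, assuming v-sync on a 60 Hz display:
# 54 frames must fill 60 refreshes, so 6 frames get held for two passes.
frames, refreshes = 54, 60
held_two_passes = refreshes - frames        # frames shown for 1/30 s
held_one_pass = frames - held_two_passes    # frames shown for 1/60 s
print(held_two_passes, held_one_pass)       # -> 6 48

# Sanity check: all 60 refreshes are accounted for.
assert 2 * held_two_passes + held_one_pass == refreshes
```

Those 6 double-length frames are the stutters you feel even though the fps counter reads a healthy-looking 54.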
An awful lot.
is it impossible for the 360 to do 45fps? not on your life.
X360 can do whatever it wants, of course.
Still, a 45fps output will be displayed as bits at 60fps and bits at 30fps, and trust me, you don't want to play something like that (remember Sega Rally 2 on the Dreamcast?).