Hmmm, I can't really tell much difference, certainly not enough to let it significantly affect my enjoyment of the game.
Can you give me an example of how the ability to perceive the difference between framerates raises your enjoyment of games versus someone who cannot tell the difference? Honest question. Maybe I am missing something.
Every game when started up for the first time should ask, "Can you tell the difference between 30 and 60 fps? Be honest.", and then set the game's visuals accordingly FOREVER (the player has to re-buy the game to change this setting).
(Each player's response is then sent to a server and tallied, with the results posted on GAF.)
It would be interesting to have an anonymous tally around these parts all the same, though I'd imagine I would definitely end up in the minority.
You don't see 60fps, but you certainly feel it. Controls, especially the mouse (or thumb sticks), are so much more responsive. When you're at 60 or more fps, there is a good chance you won't see a huge dip when there are a ton of explosions and effects on your screen. In recent console first-person games I've had a hard time aiming in hectic situations with low fps; BF3 is probably my best example. A big reason why CoD is so popular is how smooth and good it feels to play at a solid 60fps.
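For a rough sense of that responsiveness gap: a new frame (and with it a fresh read of your input) arrives about every 16.7 ms at 60fps versus about every 33.3 ms at 30fps. A back-of-the-envelope sketch, in Python just for the arithmetic (idealized numbers only; real input latency also depends on the engine, buffering and the display, so treat these as lower bounds):
Code:
# Minimal frame-time arithmetic: how long you wait for the next frame
# at a given framerate. Idealized figures, not measurements from any
# particular game or engine.
for fps in (30, 60):
    frame_time_ms = 1000.0 / fps
    print(f"{fps} fps -> a new frame every {frame_time_ms:.1f} ms")

# Output:
# 30 fps -> a new frame every 33.3 ms
# 60 fps -> a new frame every 16.7 ms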
I'm not elitist, but I'm one of the ones who can't even play through GTA V or TLOU because of frame rate. In 3 years with a high-end PC, the only game I've ever capped at 30 was Crysis 3 with everything maxed. And even then I preferred 900p with a 50-60 variable rate by a long shot. It's amazing what you lose in responsiveness, and 60 is so much easier on the eyes. Honestly, if BF4 on PS4 isn't at least 900p60, I'll just skip it.
I also disagree about slower 3rd person games being ok at 30. Maybe if they actually stayed at 30 with the exception of multiple explosions, but they don't. Console games fluctuate between 24 and 30, and that shouldn't be acceptable imo.
The only game I've played on PC with a locked 30fps was Dark Souls, because the game is designed around 30fps and in fact could not be played above 30fps without making the game just play faster.
So wrong. Durante's DSFix doesn't speed up the game.
You can get a perfect video encode, but what the video encodes has been distorted. The source is 60FPS; that's what the original video is at. What they pretty obviously did is take that same video, process it in some manner I can't quite pinpoint that drops every other frame, and publish it. You effectively lose every other frame of data. Keep in mind that the video spans the same amount of time.
So instead of playing the frames
1 2 3 4 5
it plays
1 3 5 7 9
I'm not sure I get what you're saying - that's pretty much exactly what happens with 30 fps (compared to 60 fps). Your example should be like this:
Code:
1 2 3 4 5 6 7 8 9 (60 fps)
1 3 5 7 9 (30 fps)
Dropping every other frame from a 60 fps video of recorded vsynced gameplay would give you the exact same results as 30 fps vsynced gameplay (provided there are no dropped frames in 60 fps video).
No, it doesn't. Again, it drops whole frames of data, which means it skips from frame to frame, creating extra choppiness. A game running natively at 30 fps would render frames 1 2 3 4 5 in sequence, not skip every other one.
I prefer 30FPS locked because I really dislike 60FPS when it dips below it.
Doesn't make sense when I think about it, but for me I prefer a consistent rate over an unreliable 60.
That doesn't make any sense, I'm not sure you understand what I'm saying.
It makes perfect sense, I could say the same for you.
Ok, I'll try to make it even more clear.
Your TV displays 60 fps regardless of the framerate in the game.
To expand on the example above: these 9 frames would span 150 ms in-game (9/60 of a second). If the game is running at 30 fps vsynced, every frame has to be doubled to match the refresh rate of the TV:
Code:
1 2 3 4 5 6 7 8 9 ... (60 fps)
1 1 3 3 5 5 7 7 9 ... (30 fps)
Taking a 60 fps video and dropping every other frame creates the same result (with each frame repeated once, of course, otherwise the video would be sped up by a factor of 2).
We're talking about two entirely different things. You're talking about the video, I'm talking about the source. Your post lays it out clearly: a native renderer doesn't repeat frames. It's a flawed comparison.
Contrary to popular belief, standards change over time.
Are you playing all your games at 480p because it used to be acceptable?
It really isn't. A native renderer locked at 30 fps vsynced, shown on a 60 fps display, does repeat frames - as in, it will show the same frame buffer for two consecutive frames on the display.
It's not really about standards, it's about getting the best possible gameplay experience. Games in 480p or lower are fine because the low resolution doesn't affect the gameplay in any way. When I'm playing a PC game I'll lower the resolution as much as I need to in order to get 60 fps.
We're talking about raw framebuffer grabs here; a renderer doesn't render a frame twice. That applies no matter what it gets displayed at.
No, of course a renderer doesn't render a frame twice, I never said it did. You said the video was shit because they took a 60 fps video and cut the frame rate in half to get 30 fps, and that dropping frames resulted in extra choppiness (but you never explained why).
I'll give one final example of why they're the same.
Code:
Buffer swaps at 60 fps
        |
_________________
| | | | | | | | |
* * * * * * * * *
1 2 3 4 5 6 7 8 9 ... (60 fps vsynced)
1 1 3 3 5 5 7 7 9 ... (30 fps vsynced)
*   *   *   *   *
|___|___|___|___|
        |
Buffer swaps at 30 fps
Take a video recording of the above 60 fps, cut the frame rate in half, then double the frames and you get the exact same frame sequence as the 30 fps output.
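If it helps to see that claim run rather than drawn, here's a small sketch of the same idea in Python (an idealized simulation, assuming perfect vsync on a 60 Hz display and no dropped frames; the frame numbers are hypothetical). It works out which frame is on screen at each refresh for a 60 fps and a 30 fps vsynced game, then checks that halving and doubling the 60 fps sequence reproduces the 30 fps one:
Code:
def displayed_frames(game_fps, refreshes=10, refresh_hz=60):
    # Which frame is on screen at each display refresh, assuming perfect
    # vsync. Frames are numbered by the refresh tick they were rendered on,
    # matching the diagram above.
    swap_interval = refresh_hz // game_fps   # refreshes between buffer swaps
    shown, current = [], None
    for tick in range(refreshes):
        if tick % swap_interval == 0:
            current = tick + 1               # buffer swap: a new frame is ready
        shown.append(current)
    return shown

print(displayed_frames(60))  # [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
print(displayed_frames(30))  # [1, 1, 3, 3, 5, 5, 7, 7, 9, 9]

# Record the 60 fps output, drop every other frame, then double what's left:
recording = displayed_frames(60)
halved_then_doubled = [f for f in recording[::2] for _ in range(2)]
print(halved_then_doubled == displayed_frames(30))  # True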
Devs can add motion blur to make 30 fps games look less choppy, and if the engine applies different amounts of motion blur depending on framerate then a comparison like this would be invalid of course (I don't know of any engine that does this on PC).
Oh, definitely, when it comes to racing/action games 60 FPS is a better choice. Slower games can be played perfectly fine at 30 FPS, and in many cases the trade-off could give you better results, even though 60 FPS would still be ideal if we had unlimited performance.
30fps for racing games is serviceable, yeah, but it definitely isn't ideal.
When it comes to racing games and you have a choice between 30fps with souped-up graphics or 60fps and plainer, I think there's a bigger benefit in hitting 60.
20fps looks bad enough to ruin a game.
It's obvious, no? It's not the sequential frames. It jumps to every other frame, so you're missing half the data. 30FPS is half the data inherently, yes, but it's still rendered sequentially. Have you still not looked at the videos? In the OP video, when it goes from 60 to 30, the game (video) starts to move almost in slow motion and chops from frame to frame. In my YouTube video (native 30) it's roughly the same speed, but markedly less smooth.
Please tell me that people aren't making a comparison based off watching this directly from the player on the page? That looks like Flowplayer, which I believe is still capped at 24fps due to Adobe's archaic standards. If you want to see what it actually looks like, there is a 60fps h264 mp4 file you can grab within the page source. Put that into something like VLC or MPC-HC.