I did my best to condense the question into something that would fit in the title, but all nuance is lost there, so I'd ask that you read the OP. This is not a debate about consoles vs. PCs, or an attempt to ignite some fanboy war; it's legitimate curiosity on my part. I've tried to Google this myself on multiple occasions, sometimes to win an argument with someone and sometimes just out of curiosity. I've thought about making threads on it as well, but I was honestly afraid it would just turn into people insulting me for daring to ask a question that seems like common sense and deriding me as some kind of ignorant fanboy.
Let's start with this: 60fps looks objectively better than 30fps. Anyone with a pair of eyes can see it when the two are shown side by side, even someone as insensitive to it as I am. I am in no way arguing that 30fps should be considered as good as 60fps in any scenario; if 60fps is an option, it looks objectively better. The human eye can tell the difference in framerate even above 60fps, so this is not a question of whether we can see it.
So, to the question at hand. On the surface this sounds simple, but I think it's more nuanced than "more frames means you perform better." What I want to know is: when comparing a game running at 30Hz to one running at 60Hz, do the extra frames amount to just eye candy, or do they translate into objectively better performance on the part of the player? As far as I know, the gap between individual frames at 30fps is well below the average human reaction time. I see the input-delay argument a lot, but with human reaction time sitting around 0.25 seconds for a visual stimulus, that's roughly 8 frames at 30fps or 15 at 60fps. I've thought this over, and I really can't see how reaction time could keep improving past the flicker fusion threshold, where a series of images starts being interpreted by our eyes as continuous motion.

However, I could obviously be wrong, and the way I'm connecting the dots could be wrong, which is exactly why studies matter and why I've tried to look up data. I've Googled questions along the lines of "how do gamers do in identical scenarios when running at different (assumed to be locked) framerates," among other things. Most of what comes up, though, is old forum threads that amount to users bickering about things like whether the human eye can see more than 30fps (it can). The closest thing to an in-depth analysis I found comes from Eurogamer, and their conclusion is basically "we don't know, but frame pacing and input lag make a bigger difference than framerate" (their piece is quoted just below, and I've put my own rough math right after it).
So the question is, should console titles be allowed to operate at their absolute fastest? Or should performance be capped in order to enforce the kind of consistency that Paul Rustchynsky talks about?
The short response is that there is no definitive answer. Different games target different experiences with different priorities, and gamers themselves have their own personal opinions on what works best. However, by looking at key titles, we can build up a picture of what works for us, which perhaps puts some of our tech analysis pieces on specific games into context.
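To make the input-delay side of this concrete, here's the rough arithmetic I'm working from. It's just a back-of-envelope sketch using the commonly cited 0.25-second average visual reaction time, not data from any study:

```python
# Rough numbers only: 0.25 s is the commonly cited average reaction time
# to a visual stimulus, not a measurement from any particular study.
REACTION_TIME_S = 0.25

for fps in (30, 60):
    frame_time_ms = 1000.0 / fps                 # time between frames
    frames_per_reaction = REACTION_TIME_S * fps  # frames drawn while you react
    # Worst case, your input lands just after the game sampled input,
    # so it sits for up to one extra frame before the game "sees" it.
    worst_case_extra_wait_ms = frame_time_ms
    print(f"{fps} fps: {frame_time_ms:.1f} ms per frame, "
          f"~{frames_per_reaction:.0f} frames per reaction, "
          f"up to {worst_case_extra_wait_ms:.1f} ms extra input wait")
```

Run that and you get roughly 33 ms frames and ~8 frames per reaction at 30fps versus 17 ms frames and ~15 frames at 60fps, so the worst-case extra input wait differs by about 17 ms, a small slice of a 250 ms reaction. That's why the input-delay argument alone doesn't settle the question for me, and why I'd like to see actual performance data.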
Other than that, there are a few random things I've found around the internet, such as a quip in a PC Gamer article about games not playing any better above 20fps, though there wasn't much backing it up. The relevant bit from that article:
He's not saying that we can't notice a difference between 20 Hz and 60 Hz footage. Just because you can see the difference, it doesn't mean you can be better in the game, he says. After 24 Hz you won't get better, but you may have some phenomenological experience that is different. There's a difference, therefore, between effectiveness and experience.
So, with that somewhat lengthy breakdown out of the way: are there any studies on this? I have no issue with personal opinions on the matter or anecdotal evidence, but I'm more curious whether there is any hard data on gamer performance above 30fps. Thanks in advance for any answers, and I once again plead for tame responses.