
Have there been any definitive studies on gamers "playing" better above 30fps?

Wulfram

Member
Then there's something wrong.

- Your monitor/TV is broken
- You don't save the changes you make in the drivers so the game runs at the same fps, even if you think you changed it
- Your card is always locked at the same fps no matter what you do
- Your FPS counter is bugged and reports wrong estimates while your game always runs at the same fps
- You get frame-pacing issues and stuttering instead of a consistent frame rate
- You think frame rate is something else and even though you feel the difference, you can't put your finger on it
- You have some rare eye condition

Has to be one of those or a combination of them. There is no other way for a person not to be able to notice the difference between 30 and 60 fps.

To be clear, I can tell the difference in "experimental conditions". Having refined my switching technique so I can switch FPS with less screwing around, I can now actually see a difference in game.

What I can't do is tell a difference if it's not instantaneous, and I certainly can't if I'm not looking out for it.
 

Izuna

Banned
Don't play the game myself, but CSGO players seem to think it helps them get an edge with reaction times and precision aiming.

For me, it just makes the games smoother. I found in Prey that tracking enemies at 90fps was really easy to do and I could pull off quick shots on lunging enemies that would have been difficult at lower framerates due to the chaos that can happen on screen.

Did you have a high refresh rate display? Because otherwise you just have less input lag
 

New002

Member
It's objectively better from a technical standpoint. But given the nature of it, the impact is gonna vary from person to person. It's a weird one.
 

GermanZepp

Member
http://www.cs.uml.edu/~kajal/research/pubs/mmcn06-frameRate/fr-rez.pdf

An interesting paper on the topic.

Edit:
This paper presents results of a carefully designed user study investigating the effects of frame rate and frame resolution
on users playing a first person shooter (FPS) game. A custom map was designed to allow repeated testing of the core aspect
of FPS play – aiming and shooting at an opponent. A test harness was developed to first collect demographic data for each
user, and then cycle through the custom map with different frame rates and frame resolutions, collecting user perceptions
each time. Sixty users participated in the study, providing a large enough base for statistical significance for most of the
data analyzed. Analysis shows the effects of frame rate and frame resolution to be remarkably different for computer games
than for streaming video and other interactive media. In particular, for computer games, frame rate has a pronounced effect
on user performance, while resolution does not. Both frame rate and frame resolution, however, impact user perception of
game picture quality.
 

DaciaJC

Gold Member
From personal experience: I feel like I performed better when playing in Factions in TLoU: Remastered (60 fps) compared to the 30 fps PS3 version. First time booting up the remaster, the aiming felt significantly easier and smoother.
 
I had a higher ranking with 100ms ping and a worse computer in League of Legends than I do now at 30ms ping and 100fps so in my incredibly large sample size I'll say playing with worse graphics makes you better.
 

Brashnir

Member
A game that requires tight timing windows will be easier to play at higher framerates. (input lag is also an issue for these types of games, but we'll only discuss the effect of framerate on input delay for the purposes of this discussion.)

Let's take a game like Mike Tyson's Punch Out on NES as an example.

Early fights have very generous timing windows, giving the player plenty of time to react to on-screen triggers. However, later fights have much shorter timing windows. I don't know what the exact windows are to dodge, say, Mr. Sandman's Dreamland Express, but for the purposes of this discussion, let's use a made-up-but-probably-not-that-far-off number of 333 milliseconds. (aka 20 frames at 60fps)

The average human's reaction time is ~200ms. Some people are a little quicker, and some a little slower, but the vast majority of the population falls in the 170-230ms range. We'll stick with 200ms for the purposes of calculation.

So right off the top, once you take away the 200ms of human delay, you are down to 133ms (eight 60fps frames) to react to the enemy attack. If the game is displaying at 30FPS, you are getting the visual information 16.7ms later. In addition, your input is getting to the display 16.7ms later as well, which combined is another 33ms removed from the original 333.

In this situation, your real timing window drops from 133ms to 100ms. You've literally lost a quarter of the time you have to react to the attack just due to framerate. If your personal reaction time is in the 230ms range, you lose a full third of your reaction window to framerate.
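The arithmetic in the post above can be sketched in a few lines of Python. This is a hedged sketch: `effective_window` is a made-up helper, and the 333 ms window and 200 ms reaction time are the illustrative figures from the post, not measured values.

```python
# Illustrative numbers from the post: a 333 ms dodge window and a 200 ms
# human reaction time. The assumption (as the post argues) is that 30fps
# costs one extra 60fps-frame of display latency plus one of input latency.
FRAME_60 = 1000 / 60  # ~16.7 ms per frame at 60fps
FRAME_30 = 1000 / 30  # ~33.3 ms per frame at 30fps

def effective_window(window_ms, reaction_ms, fps):
    """Milliseconds left to act after reaction time and framerate delays."""
    extra = 2 * (FRAME_30 - FRAME_60) if fps == 30 else 0.0  # ~33 ms at 30fps
    return window_ms - reaction_ms - extra

print(round(effective_window(333, 200, 60)))  # 133 ms left at 60fps
print(round(effective_window(333, 200, 30)))  # 100 ms left at 30fps
```

Plugging in a 230 ms reaction time instead reproduces the "full third" loss the post mentions.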
 

rodrigolfp

Haptic Gamepads 4 Life
Point to you sir!

It's just so disheartening to see so many closed-minded individuals on such a meaningless topic. I cannot tell the difference and none of you can prove to me otherwise.

Get a shooter like Quake 3 or whatever CoD/BF, with a good mouse, use RivaTuner to limit it to 15/30/45/60/120fps, move your crosshair at each limit, then come back here saying that you don't feel the difference...
 

Mockerre

Member
I don't think there has ever been 'legitimate' research into the subject, as scientists have better things to do ;) Maybe a video card manufacturer did some, but they'd be pretty suspect. So you're left with anecdotal evidence from players.

Personally, I have 60 fps. The game runs smoother, but looks like a cross between a cartoon and a TV soap opera, very 'fakey'.
 
I had a higher ranking with 100ms ping and a worse computer in League of Legends than I do now at 30ms ping and 100fps so in my incredibly large sample size I'll say playing with worse graphics makes you better.

A pro Dota 2 player told me he used to play the game with a terrible Internet connection and he thought it actually helped him in the end because it meant he had to learn to anticipate other players' actions.
 
To be clear, I can tell the difference in "experimental conditions". Having refined my switching technique so I can switch FPS with less screwing around, I can now actually see a difference in game.

What I can't do is tell a difference if it's not instantaneous, and I certainly can't if I'm not looking out for it.
Same here, unless I'm switching back and forth between 30fps and 60fps I struggle to notice. Maybe it's because I play all my games in the morning and I'm too sleepy lol!
 
Get a shooter like Quake 3 or whatever CoD/BF, with a good mouse, use RivaTuner to limit it to 15/30/45/60/120fps, move your crosshair at each limit, then come back here saying that you don't feel the difference...

Never said I couldn't 'feel' the difference. Visually I have never seen any difference between 30 and 60 fps.

As soon as I get a PC I'll get right to your suggestion!
 

Manu

Member
Dark Souls 1 has physics tied to framerate, and Dark Souls 2 has weapon durability degradation tied to framerate (fixed with an update to Scholar of the First Sin).

Scholar didn't fix shit, all they did was change it so that bonfires restore durability to mitigate the issue.

Bosses like Sinh or Lud&Zallen still eat through your weapons.
 

Tagyhag

Member
Point to you sir!

It's just so disheartening to see so many closed-minded individuals on such a meaningless topic. I cannot tell the difference and none of you can prove to me otherwise.

You can keep saying that 1+1 = 5 all you want, but that doesn't make it true.

What you're implying is that you have some sort of condition that means you can't tell the difference in refresh rates at all.

So that means these 3 bars are all running at the same framerate and look equally smooth to you.

[UzQJun1.gif: animated comparison of three bars moving at different framerates]


It also means that you perceive real life at a different speed than everyone else in the world.

That technically makes you superhuman!

To be clear, I can tell the difference in "experimental conditions". Having refined my switching technique so I can switch FPS with less screwing around, I can now actually see a difference in game.

What I can't do is tell a difference if it's not instantaneous, and I certainly can't if I'm not looking out for it.

And that's totally fine, because it happens to a lot of people, especially if someone doesn't play in high framerate all the time.

Like I said, there's a difference between people who don't really notice it unless they're shown specific tests, and those that say there is never a difference no matter what.

One is physically possible, the other isn't.
 

Oh man I bet you felt big posting this reply!

Your chart, sure I can see a difference. In a VIDEO GAME, between 30 and 60fps, no difference. None at all. Let that pop a vein out of your collective foreheads.
 

Deleted member 325805

Unconfirmed Member
If I cap Overwatch to 30fps I'll definitely play worse as it'll be choppy with higher input delay.
 
Oh man I bet you felt big posting this reply!

Your chart, sure I can see a difference. In a VIDEO GAME, between 30 and 60fps, no difference. None at all. Let that pop a vein out of your collective foreheads.
Why are you posting like you're superior or something because you really want to believe you don't see a difference in framerate?

I truly appreciate the effort being put into trying to convince me that I can see something I can't.
Ugh
 

Crossing Eden

Hello, my name is Yves Guillemot, Vivendi S.A.'s Employee of the Month!
I truly appreciate the effort being put into trying to convince me that I can see something I can't.
What do your eyes see in that gif above? Video game framerates don't magically function differently. This is either willful ignorance or you're straight up lying.
 
Why would I need a definitive study?

I've played numerous games at both 30 fps and 60 fps. I *always* play better when the game runs at 60 fps.
 

Izuna

Banned
It's funny because I've had my laptop running at 100Hz the past few days, and 60fps looks noticeably slower to me.

I personally don't believe people who claim they can't see 30fps.
 

Tagyhag

Member
Oh man I bet you felt big posting this reply!

Your chart, sure I can see a difference. In a VIDEO GAME, between 30 and 60fps, no difference. None at all. Let that pop a vein out of your collective foreheads.

Ah, and that's fine! Again, if someone's pretty much just console gaming, it's going to be harder for them to tell the difference because they're used to lower framerates in general.

It's easier for someone who games almost exclusively on 60fps and higher to notice the differences.

It sounded like you were implying that you couldn't tell the difference in anything. :p

We cool then!
 

arcticice

Member
I was also under the impression that the difference couldn't be that huge. But playing R6 Siege's multiplayer at 60fps is an awesome experience compared to Siege's Terrorist Hunt co-op mode, which runs at 30fps.

Those of you who can't/won't/don't understand the difference between the frame rates, I highly suggest giving Siege a try for this reason alone. The difference is mind-boggling. Although you get used to either soon after, 60fps is a better experience in every way.
 

Keihart

Member
I think there are clear examples of how this works that people don't like to recognize, because reaction gets overestimated when prediction is more important in games.

Take Tekken 7, with 7 frames of input delay. That is more input lag than what you would gain by playing at 30 fps.

7 frames of delay is above 100 ms when playing at 60 fps. Playing at 30 fps would add at most 20 ms of delay. (Google says the fastest reaction time is about 100 ms, with an average of 200 ms, for reference.)

So input lag is really not affected that much when the game is designed around 30 fps, maybe you can make a case in the amount of information you can see, but i have no idea how to measure that.
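The frame-count-to-milliseconds conversion behind those numbers can be sketched as follows. `frames_to_ms` is a hypothetical helper, and the 7-frame figure for Tekken 7 is the poster's claim, not something verified here:

```python
def frames_to_ms(frames, fps):
    """Convert a frame count at a given framerate into milliseconds."""
    return frames * 1000.0 / fps

# Tekken 7's claimed built-in delay: 7 frames at 60fps
builtin_delay = frames_to_ms(7, 60)                         # ~116.7 ms
# Extra latency from one frame-time at 30fps instead of 60fps
extra_at_30fps = frames_to_ms(1, 30) - frames_to_ms(1, 60)  # ~16.7 ms
print(round(builtin_delay, 1), round(extra_at_30fps, 1))    # 116.7 16.7
```

So by this sketch, the framerate drop adds roughly one seventh of the delay that is already baked into the game.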
 
Ah, and that's fine! Again, if someone's pretty much just console gaming, it's going to be harder for them to tell the difference because they're used to lower framerates in general.

It's easier for someone who games almost exclusively on 60fps and higher to notice the differences.

It sounded like you were implying that you couldn't tell the difference in anything. :p

We cool then!

*fistbump
 
Throwing it out there that as someone who built a PC primarily to get 60 fps in more titles, I legitimately don't think there's a big difference in slower paced games.
 
Playing at 30 fps would add at most 20 ms of delay. (Google says the fastest reaction time is about 100 ms, with an average of 200 ms, for reference.)

So input lag is really not affected that much when the game is designed around 30 fps, maybe you can make a case in the amount of information you can see, but i have no idea how to measure that.
Really not sure that's how input lag works.
 

Spacejaws

Member
I uh, own a laptop that can play The Witcher 3 capped at 60fps with medium/low settings. I played like that for a few days until I tried capped 30fps with all settings maxed and never went back; the game looks stunning.

In a similar scenario I played DOOM on high settings and it rocked 60fps mostly. It really depends on the game; with something like The Witcher I love the glossy feel and the scenery.
 
I uh, own a laptop that can play The Witcher 3 capped at 60fps with medium/low settings. I played like that for a few days until I tried capped 30fps with all settings maxed and never went back; the game looks stunning.

In a similar scenario I played DOOM on high settings and it rocked 60fps mostly. It really depends on the game; with something like The Witcher I love the glossy feel and the scenery.
It doesn't depend on the game. It depends on having a consistent framerate. That's why even games that could technically go higher than 30fps on console are capped at 30: they can't consistently stay at 60. And of course the game will look better with higher graphics settings.
 

rodrigolfp

Haptic Gamepads 4 Life
Dark Souls 1 has physics tied to framerate, and Dark Souls 2 has weapon durability degradation tied to framerate (fixed with an update to Scholar of the First Sin).

I know, and not a single strange thing happened with DSFix. On SotFS the double degradation happened and just that...
 
A game's engine could update at 20fps but output graphics at 60fps (interpolating positions between updates, as online multiplayer games do), or the engine could run at 60fps but output graphics at 20fps (e.g. dropped frames), so maybe it's important to specify which one. I'd wager the first one would "feel" more responsive than the second one.
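The first scenario can be sketched in a few lines, under stated assumptions and with all names hypothetical: a fixed 20Hz engine tick, with the renderer blending between the last two simulated positions to produce smooth in-between frames.

```python
TICK = 1 / 20  # engine simulates at 20 updates per second

def interpolate(prev_pos, curr_pos, alpha):
    """Blend the last two simulated positions for a smooth in-between frame."""
    return prev_pos + (curr_pos - prev_pos) * alpha

# The engine's last two ticks put an object at positions 0.0 and 1.0.
# The renderer draws a frame 25 ms into the 50 ms tick, i.e. halfway.
alpha = 0.025 / TICK  # 0.5
print(interpolate(0.0, 1.0, alpha))  # 0.5
```

The renderer can run at any rate this way, but inputs are still only sampled 20 times a second, which is why the two cases feel different.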
 

recursive

Member
Some people can play games at 20fps. Kinda hard to throw science at them when their enjoyment is equal to ours.

Sure, that will also depend on the game. But the question here is performance of the gamer not the level of enjoyment. Although I can say I enjoy most games more with a higher frame rate.
 

GametimeUK

Member
60fps is not "objectively" better than 30fps. Let's face it, it's almost universally agreed that it's better, but it isn't actually "objectively" better. It's still a matter of opinion.
 

King_Moc

Banned
No, researchers don't undertake studies on unquantifiable subjective internet arguments

It's not subjective.

Seriously, I challenge anyone to play a Mario game or something similar on an emulator with it locked to 30fps and not find it a complete mess.
 

Izuna

Banned
60fps is not "objectively" better than 30fps. Let's face it, it's almost universally agreed that it's better, but it isn't actually "objectively" better. It's still a matter of opinion.



How so? (The bolded part about Google is a reference to how fast humans react, so you can put the input lag in games in perspective.)

reaction time != input lag

I can't wrap my head around what's so confusing about this.

If you have a PC, go to any old game you can max out and use the UI with VSync on and off
 