Why be so dramatic about it? Instead of saying that it doesn't run as smoothly or look as polished at 30fps (assuming it lacks good post-processing, motion blur, etc.), you use a hyperbolic descriptor like "a complete mess" when it clearly isn't one. If that were objectively true, a game like SM64 would be inferior to NSMB2 purely because of its framerate. That's how silly it sounds.
It's not subjective.
Seriously, I challenge anyone to play a Mario game or something similar on an emulator with it locked to 30fps and not find it a complete mess.
So this is second-hand information?
You're not going to "End of story" anyone's opinion.
This is as objective as eggs giving more protein when cooked.
Unless it's SFV, where you can't really react in time because input lag is too high, haha.
Eh, yes of course. Hence why Call of Duty is so popular.
So has anyone posted actual studies demonstrating statistically superior performance of gamers at 60fps? Not people saying "duh its better", actual studies demonstrating an improvement in player outcomes. Because I saw nothing on the first or last page.
http://m.neogaf.com/showpost.php?p=241996065
It doesn't matter where the information comes from. The fact of the matter is that 60fps was unplayable for someone due to motion sickness, and the 30fps option was better for her. It's okay to get information from someone else instead of assuming your own opinions and preferences are objectively better than everyone else's.
What do you think is subjective? The benefits of 60fps? Or that it is preferred?
Because for one of those, I'll fight you.
The other, I'll just think you're cray and move on.
The thread title and content are incoherent. You probably wanted to title the thread "do gamers perform better above 30 FPS?" Most responses are off-topic as a result.
Here's another study, this one including input lag, from a quick Google search... took all of 10 seconds...
http://www.csit.carleton.ca/~rteather/pdfs/Frame_Rate_Latency.pdf
Also, on a personal level, it's painfully obvious if you try to play something like Quake at 30, 60, and uncapped FPS that you perform better at higher framerates.
All I know is that I am the only one correctly using the word and for such a dumb person (which I am) I can't believe how many people here fail to grasp basic English and comprehension.
"Let's start out with 60fps looks objectively better than 30fps"
Is the sentence in the OP that I took issue with. But I think we have reached middle ground with your comment here. I was talking about being preferred aesthetically which is a bit silly of me because I jumped the gun instead of debating the main point of the actual OP. Thanks for the discussion. I guess I could have been more clear in my debate, but I think my point stands in relation to that sentence in the OP.
By no means what you're asking for, but I did experiment with frame rate and perceived game performance years ago when I had a PC available.
"Yeah, even though evidence supports that the Earth is not flat, it's still an opinion"
I wonder how many in here believe the earth is flat. I mean the mental gymnastics, wow!
I think it's people trying to justify their console purchases. Or because their favorite games are console exclusives at 30 fps.
It's kinda insane there are still arguments about this, on a gaming enthusiast site of all things.
Retroheads refer to old games that were made with old CRT screens in mind. Yes, those games look better on inferior resolutions because they fit better.
You will have retro heads that will swear blind that some of the best visuals come from running games on a CRT. Sorry, looks like a lot of people don't know the definition of the word in this thread. It's still subjective.
This is all the research you need. It reduces input delay.
What frame rate does Super Mario Kart on SNES run at?
60fps.
Input lag matters far more than the actual framerate in terms of gameplay success. If you have a 30FPS game with really low input lag, like the new R&C for example, you may find yourself being more precise in it than in some 60FPS game with high input lag.
*HOWEVER* It is very, very rare to find any 30FPS game with lower input lag than just about any 60FPS game. A higher framerate, in and of itself, provides lower input lag. If a game has low input lag at 30FPS, it would be even lower if it ran at 60. It's true that for many games this won't really matter if the lag is low at 30FPS to begin with, but if the input lag is high at 30, having the game run at 60 can help tremendously. Just try playing Rise of the Tomb Raider on PS4 Pro. It lets you choose between 30FPS and 60FPS (more or less, but it's 60FPS in the areas that matter for this comparison anyway). Get into any firefight or try to hit a moving target, and see how you do in each mode. I found some of the moving targets practically impossible to hit in 30FPS mode, but I could hit them with ease in 60FPS mode.
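To put rough numbers on why a higher framerate lowers the latency floor on its own, here's a quick sketch (the 3-frame buffer is an illustrative assumption; real input lag also includes engine, display, and controller delays on top of this):

```python
# Sketch: the latency contributed by frame rate alone.
# Real-world input lag adds engine pipelining, display processing,
# and controller polling on top of this floor.

def frame_time_ms(fps):
    """Duration of a single frame in milliseconds."""
    return 1000.0 / fps

for fps in (30, 60):
    print(f"{fps}fps: one frame = {frame_time_ms(fps):.1f}ms")
# 30fps: one frame = 33.3ms
# 60fps: one frame = 16.7ms

# Assume (hypothetically) a game pipeline that buffers 3 frames
# between input and photons: the same pipeline is twice as laggy at 30fps.
for fps in (30, 60):
    print(f"{fps}fps with 3 buffered frames: {3 * frame_time_ms(fps):.1f}ms")
# 30fps with 3 buffered frames: 100.0ms
# 60fps with 3 buffered frames: 50.0ms
```

So even with an identical engine pipeline, the 60FPS version starts with roughly half the framerate-induced delay.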
Also, playing with a mouse is a whole other ballgame. There, due to the expected 1:1 hand/screen update response, anything less than 60FPS feels like nightmarish molasses, compared to playing something at 30FPS with a controller.
I think the only way to properly test this would be to find a game that has the exact same input lag at 30FPS as it does at 60FPS. But I don't know how that could be arranged.
30 fps > 60 fps
games look more cinematic
Retroheads refer to old games that were made with old CRT screens in mind. Yes, those games look better on inferior resolutions because they fit better.
There's also the argument that CRTs still have better motion resolution than modern panels, which is ideal for 2D scrolling games (like most older games are, what a surprise)
That doesn't make higher resolutions not better than lower resolutions though.
You're objectively wrong about the definition of objective. This is such a weird hill to die on man. Don't do this to yourself.
I've had people on this very site tell me that Super Mario Galaxy looks better on an old-school setup as opposed to using the Dolphin emulator. Neither is better than the other... because it's subjective
and I am right as usual.
Yes, but what was the reason they liked the old-school setup better?
best post yet i think
I couldn't tell you if anyone has put any actual studies up proving this to be fact. I think the best way to gauge this and get some good feedback is to look at some of the competitive online gaming communities, like CS:GO (which I just mentioned), Overwatch, maybe even LoL, and even older groups like Quake Live. Generally speaking, I think you will find that more competitive players will drop game details and turn up framerates to get any advantage they can. I think it is proven true when you look at these groups in particular.
Thank you.
That's not as ironclad as you might think. While there's no argument to be made against higher framerates "feeling" better and more responsive, the question is really whether this makes a statistically significant impact on gaming performance, which moves beyond simply whether we can perceive higher framerates and think they look nicer. Even if you or I can notice the difference between 60hz and 144hz (and I can, I have a 144hz monitor and a 980ti), is this the limiting factor on my or any other human's gaming performance? I severely doubt that it is. The 30->60 jump would probably not even amount to much once you translated the study from a very simple latency test (like the study someone quoted above) into a full gaming environment. Differences that do exist get swamped under a mountain of confounding variables.
There are some studies on page 2 that I pulled some charts from. While there is an increase in performance from 30->60, it's very slight, especially when compared with going from 15 to 30 fps. There are diminishing returns, and only margin-of-error levels of difference after 45fps according to the second study.
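One plausible reason for those diminishing returns is that equal framerate jumps save less and less actual frame time. A quick sketch of the arithmetic (my own back-of-envelope numbers, not from the studies):

```python
# Each framerate step saves less wall-clock time per frame than the last,
# which lines up with performance gains flattening out past ~45fps.
steps = [15, 30, 45, 60]
times = {fps: 1000.0 / fps for fps in steps}  # frame time in ms

for lo, hi in zip(steps, steps[1:]):
    saved = times[lo] - times[hi]
    print(f"{lo} -> {hi}fps saves {saved:.1f}ms per frame")
# 15 -> 30fps saves 33.3ms per frame
# 30 -> 45fps saves 11.1ms per frame
# 45 -> 60fps saves 5.6ms per frame
```

The 15->30 jump saves three times as much frame time as 30->45 does, which would predict exactly the pattern those charts show.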
I think the differences would be more apparent with higher level play. We have professional fighting game players talking about how 8 frames of input lag at 60fps is so much worse than 4. To those players, 30 vs 60 fps would make a greater difference than an average hardcore/experienced gamer. I think the studies need to be done with that in mind - self-identified experienced gamers are still likely not at the level of competitive professionals.
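For context on those fighting-game numbers, "frames of lag" only translates into wall-clock delay relative to the frame rate; a rough conversion (assuming a steady frame rate, which real games don't always hold):

```python
# "Frames of input lag" is only meaningful relative to the frame rate:
# the same frame count means twice the real delay at 30fps vs 60fps.

def lag_ms(frames, fps):
    """Convert a frame count of input lag to milliseconds."""
    return frames * 1000.0 / fps

print(f"8 frames at 60fps = {lag_ms(8, 60):.1f}ms")   # 133.3ms
print(f"4 frames at 60fps = {lag_ms(4, 60):.1f}ms")   # 66.7ms
print(f"8 frames at 30fps = {lag_ms(8, 30):.1f}ms")   # 266.7ms
```

So the 8-vs-4-frame complaint is a dispute over about 67ms, which is in the range a 30FPS cap alone can add.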
Yes, but what was the reason they liked the old-school setup better?
I'll tell you why. CRT motion resolution. CRTs don't have that shitty motion blur of modern displays. That gives them much better clarity in motion.
So it wasn't a matter of 480p vs higher res Dolphin. It was all about motion again. People preferred the CRT not because of resolution but because of motion clarity. And yes, resolution VS motion is subjective.
Smoother motion vs not as smooth motion is not.