Completely in awe at some of the replies in here. How is it not painfully obvious which one is 60? Now, I'm no eye doctor, but I'm genuinely curious how some people are unable to tell. Is it an eye thing? Does it have something to do with how the eye transmits images to the brain? No offense intended, but I'd like to read more about this.
There are several components to it:
-The display's quality/refresh rate
-At what rate your eye receives light (the image); for practical purposes this capacity is unlimited, since the eye doesn't capture discrete frames
-At what rate your optic nerve transmits what the eye sees to the brain (as electrical impulses)
-At what rate your brain deciphers the impulses
-Brains also have response thresholds: if the same signal doesn't arrive enough times within a window measured in milliseconds, it won't cross your "response threshold" and won't trigger your deciphering mechanisms (see the toy sketch after this list)
-Like anything else, your brain can be trained to be more responsive to certain frame rates (sidebar: consider the people who argue that all movies should be 48fps like The Hobbit, even if it looks weird to audiences now, because children raised on 48fps won't think it looks weird once they're the cash-spending 18-35 demo)
-It would be interesting to know whether the aces on this test are primarily PC players, what kinds of displays they're using, etc., but that is beyond the scope of the topic
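To make that response-threshold bullet concrete, here's a minimal toy sketch in Python. The 50 ms window and the repeat count are made-up placeholders, not real physiology; it just counts how many frame onsets of a given frame rate land inside a fixed integration window and checks whether that clears an assumed threshold:

```python
# Toy model of the "response threshold" idea -- illustrative numbers only,
# not physiology.

def frames_in_window(fps: float, window_ms: float) -> int:
    """How many frame onsets fall inside one integration window."""
    frame_time_ms = 1000.0 / fps
    return int(window_ms // frame_time_ms)

def crosses_threshold(fps: float, window_ms: float = 50.0, needed: int = 3) -> bool:
    """True if the signal repeats often enough inside the window.

    window_ms and needed are assumptions standing in for whatever the real
    neural integration window and repeat count would be.
    """
    return frames_in_window(fps, window_ms) >= needed

for fps in (24, 30, 48, 60, 120):
    print(f"{fps:>3} fps: {frames_in_window(fps, 50.0)} frames in 50 ms, "
          f"registers: {crosses_threshold(fps)}")
```

The numbers are arbitrary; the point is only that whether a rate "registers" depends on the window and threshold, not on the eye itself.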
Honestly, I decided to give a few journal databases a skim to see what kind of research has been done on fps, and the answer seems to be: not a lot. Outside of art/movies/games there doesn't seem to be much application for fps, and people who actually care about it may be a very small subset of the consumer population. The other people who care about fps are in physics, but they're not interested in how it relates to the human eye.
Going back to The Hobbit: most people don't complain about movies being choppy even though they're typically shot at 24fps. Instead, people say The Hobbit at 48fps looks "weird" or more like a TV show than a movie. I could definitely tell the difference with the movie, but I can't tell the difference on this test. Possibilities: my monitor sucks, it's easier to notice frame-rate changes in a large image, the "brain training" is different for movies (large format) and games (small format), etc. I'd also like to muse about The Hobbit using real footage instead of CGI, but considering how much of that movie is CGI, that claim can't really be made.
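For what it's worth, here's the back-of-the-envelope arithmetic on frame times (nothing about perception claimed): the absolute time saved per frame shrinks as the base rate goes up, which might be part of why the 24-to-48 jump in the cinema is easier to spot than differences between already-high rates on a monitor, though that's speculation on my part.

```python
# Plain frame-time arithmetic for some common rates.
for fps in (24, 30, 48, 60, 120):
    print(f"{fps:>3} fps -> {1000 / fps:5.1f} ms per frame")

# How much shorter each frame gets when the rate doubles.
for low, high in [(24, 48), (30, 60), (60, 120)]:
    saved = 1000 / low - 1000 / high
    print(f"{low:>3} -> {high} fps: each frame is {saved:.1f} ms shorter")
```

So 24fps is about 41.7 ms per frame, 48 is 20.8 ms, 60 is 16.7 ms, and 120 is 8.3 ms; doubling 24 to 48 shaves 20.8 ms off each frame, while doubling 60 to 120 only shaves 8.3 ms.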
We need to put 300 GAFs with a balanced assortment of gaming pedigrees in a room with 300 120Hz monitors and see what happens.
tl;dr: It's not the eyes; it's the display equipment and the brain's response threshold.