I am not a scientist, but being in the graphics field...
I really don't think there is one answer to this question. It honestly probably depends on the person. There may be some people who can't distinguish anything over 100 fps, and some who can distinguish anything under 200 fps. Each human eye is different. Just as everyone's vision isn't exactly the same (as evidenced by some people needing corrective lenses), I doubt everyone's perception of framerate is identical.
That said, I do think some people aren't lying when they say 60 and 120 isn't much of a difference to them in slow-to-medium movement scenes. 60 and 120 is a large difference, but that doesn't mean everyone can perceive it. Some people may be more sensitive to the nuances of framerate than others.
I remember when I was in college doing research for a Video & Graphics presentation, I came across some biological studies saying that the human eye can't see any difference in motion above 100 fps.
Take a big camera flash and flash him right in his face. Ask if he saw the 1/1000 second of light.
That is not a test of sampling rate, Mr. Scientist /o\
What FPS does reality run at? (And yes, arguably it does run in frames... there is a smallest meaningful unit of time, the Planck time, about 10⁻⁴³ seconds.) An interesting question TBH. All a bit OT but rather interesting (I am not going to explain all of this... I am sure those of you who are interested can Google it). Let's just say that reality runs at a VERY high framerate: a 1 with 43 zeros behind it.
Then again, time is relative... that means the framerate of reality is slightly different for each of us (if we are moving, ofc).
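For fun, the back-of-the-envelope arithmetic behind that number takes only a few lines. This is a tongue-in-cheek sketch: the Planck time figure is the standard approximate value, but treating it as a literal "frame duration" is, of course, just the joke above.

```python
# Tongue-in-cheek sketch: if each "frame" of reality lasted one Planck
# time, reality's "framerate" would be the reciprocal of that duration.
PLANCK_TIME_S = 5.39e-44  # seconds, approximate standard value

reality_fps = 1.0 / PLANCK_TIME_S
print(f"reality runs at about {reality_fps:.2e} fps")  # on the order of 10^43
```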
Reality Lag!
"Many more people (or at least those who bothered to comment, take polls, etc.) disliked the way the two HFR Hobbit movies looked in motion than liked it. It had a very obnoxious soap-opera effect going on that made it look comparatively awful next to the 24 fps version."

"Disliked" probably just means it felt different. People are so used to 24 fps in movies that anything different will feel weird to them, and that translates to "bad".
After 48 fps becomes the standard, future generations will watch old 24 fps movies and wonder how we could live with this shit.
Your friend is so wrong... But we can't see the difference between 720p and 1080p on a screen smaller than 50" at typical viewing distances, though.
Indeed... I hate that they always mention that.
That 10% of our brain is there for a reason... because the brain can't regenerate, so it needs some spare "parts" to move stuff to when old stuff fails. Anyway... a bit OT
I don't understand how a PC Master Race guy (the OP's self-proclaimed friend) can't tell the difference between 30 and 60?
It is like night and day?
Does your friend work at IGN?
your friend needs to upgrade his eyeballs
Embarrassing..
At 16-bit too!

Tell him: "The eye can only see 30 fps tops, but since you have 2 eyes, you can see 60 fps!"
"Well, it clearly establishes you can detect events shorter than 1/30 of a second."

Right, but that's not the same as saying that your perception has good temporal resolution. Being able to detect the presence of brief stimuli and being able to distinguish two temporally closely-spaced events are two entirely different things. The former is a test of sampling coverage (which the eye should excel at), the latter is a test of how clearly those samples are resolved (which the eye isn't quite so good at, though of course the 12 fps thing is absurd). If you were in a dark room and a very sharp impulse of photons hit your eye in a one-nanosecond period, you'd probably be able to pick that up too, but that doesn't mean there'd be meaningful benefit to using a screen with anything remotely on the order of 1 billion fps.
The difference between 30 and 60 fps is obvious.
Past 60 fps there's almost no difference (I can't see any, personally.)
...
I mean, I've never really played above 60 fps since I don't have a nice PC, but afaik 120 should be smoother than 60?
I've used a 120 Hz screen at a friend's house and there's a pretty big difference. It's much smaller than the difference between 30 FPS and 60 FPS, though, and that comes down to frame time: 30 FPS = 1 image every 33.3 ms, 60 FPS = 1 image every 16.7 ms. You can see how halving that again and again leads to smaller and smaller differences in perceived smoothness.
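The diminishing returns described above are just arithmetic on frame times. A quick sketch (illustrative numbers only; perceived smoothness itself is subjective):

```python
# Each doubling of the refresh rate halves the frame time, so the
# absolute reduction in per-frame latency shrinks with every step.

def frame_time_ms(fps: float) -> float:
    """Time each frame stays on screen, in milliseconds."""
    return 1000.0 / fps

for fps in (30, 60, 120, 240):
    print(f"{fps:>3} FPS -> {frame_time_ms(fps):5.2f} ms per frame")

# Absolute improvement from each doubling:
print(f"30 -> 60:  {frame_time_ms(30) - frame_time_ms(60):.2f} ms saved per frame")
print(f"60 -> 120: {frame_time_ms(60) - frame_time_ms(120):.2f} ms saved per frame")
```

Going from 30 to 60 saves about 16.7 ms per frame, but 60 to 120 saves only about 8.3 ms, which is why each step up feels less dramatic than the last.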
I'm an Optician.
Your friend is an Idiot.
You can choose any of my suggestions below:
a) You can get a new friend
b) You can educate your friend
c) You can get a new friend
d) Tell him to go see an optician.
(You don't need to be an Optician to know any of that.)