Your opinion is bad.
Many more people (or at least of those who bothered to comment, take polls, etc.) disliked than liked the way the two HFR Hobbit movies looked in motion. It had a very obnoxious soap-opera effect going on that made it look awful compared to the 24 fps version.
Saying that someone's opinion is bad because they don't like the way videos look at higher framerates, compared to video games, is itself a bad opinion. The two aren't even similar, because one is completely passive while the other is interactive.
Unless playing an Uncharted game of course.
> Many more people (or at least from those who bothered to comment or take polls etc) disliked the way the two HFR Hobbit movies looked in motion.

Their opinion is bad, too.
It's impossible for moving images to not look worse at lower framerates. Higher framerates cause video to become more lifelike, more realistic. Low framerates cause the image to become a juttery, blurry mess when the camera pans - higher framerates help reduce or eliminate the disconnect from reality that occurs when this happens and allow you to remain immersed in the video. Preferring lower framerates for anything is a bad, incorrect opinion.
I know, right?! But then he kind of made me uncertain. He explained it like this: the human eye has a response time of 30 Hz, where everything completely in sync with this would look completely smooth (if it has motion blur and so forth), but we perceive higher framerates as better because more information means a smaller chance of your eye getting out of sync with the video.
> Well he's full of shit. I can barely tell the difference between 30 and 60 unless I'm looking for it, but I still notice significant frame-drops immediately because it feels like I'm dragging the control stick through molasses.

He dismissed this too.
> This isn't something where we need to give out participation trophies to everyone, or be accepting of subjective opinions, accepting people's differences.

Conversely, fuck you. I usually can't tell the difference. It has to be something with intense movement like F-Zero for me to tell the difference. I've never denied that other people can tell the difference, but you don't have to be a dick about it.
Each eye can see 30fps and thus 60 fps for most of us
Since silent films gave way to talkies in the 1920s, the frame rate of 24 frames per second has become standard in the film industry. 24 fps is not the minimum required for persistence of vision--our brains can spin 16 still images into a continuous motion picture with ease--but the speed struck an easy balance between affordability and quality. For the past century, cinema has trained us to recognize 24 frames per second as a reflection of reality. Or, at least, a readily acceptable unreality.
But to the average viewer, 48 fps looks like an exaggerated version of a television program shot at the common video tape speed of 30 fps.
How do I hate thee? Let me count the ways
Smith's attentional theory makes me wonder if the way we watch movies is something we learn subconsciously from the first moments we sit in front of a television. "Certainly there's a familiarity effect," he says, after thinking about it for a moment. "We're aware of what a cinematic image looks like. Now we're aware of what a TV image, even a high def TV image, looks like. So when we see the film image at 48 frames per second, it's queueing all of our memories of seeing something similar in TV. That's why people are calling it the soap opera effect or bad TV movie effect. Because that's what it looks like, what it reminds us of.
"Whether we learned to expect it to be a certain frame rate...I don't think it operates on that level. These low level sensory behaviors are something we don't really have conscious access to, we can't really control it. Our eyes know how to make sense of the real world, and they know how to make sense of a still painting or a movie. We will change what we interpret in the image based on the way it's presented to us. But we can't really see the frame rate directly. We can only see the consequence of it in what we perceive."
This is perhaps the simplest and clearest explanation of why High Frame Rate projection looks so unpleasant: It contradicts our memories and expectations. Even if we haven't "learned" to expect 24 fps playback, by this explanation kids could grow up watching 48 fps video and find it perfectly palatable.
On a surface level, 48 fps motion simply looks too fast, like a movie playing on fast-forward. But there's something specific about HFR that makes it especially unsettling, which may explain why we find it horrifying in film but don't mind it in video games: the uncanny valley.
> I tried showing him the comparisons, but he refused.

Dude's in denial.
I've always wondered if we used more than 10% of our brain if we could see more than 30fps with our eyes.
> That's a bullshit article.

Regardless, we aren't even seeing "frames" anyway; we are perceiving the fluidity of motion. According to this, we perceive reality at a rate somewhere between 24 fps and 48 fps.
http://movieline.com/2012/12/14/hobbit-high-frame-rate-science-48-frames-per-second/
> "Tests with Air Force pilots have shown that they could identify a plane in a picture flashed for only 1/220th of a second."

That's a different question from "what frame rate do I need for smooth motion".
http://www.100fps.com/how_many_frames_can_humans_see.htm
> There is still a perception issue; there'll be a point where you just can't tell the difference any more, but we can still perceive changes in framerate up to 300 fps.

If you account for temporal aliasing, you may see changes above 1000 Hz; if you don't, 300 Hz is way overkill.
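The temporal-aliasing point can be sketched numerically. This is a toy model (the function name and numbers are illustrative, not from the linked BBC post): any motion faster than half the frame rate gets folded back into the visible range, which is why changes can remain detectable well above the rates the eye resolves directly.

```python
# Illustrative sketch of temporal aliasing (the "wagon wheel" effect):
# motion at frequency true_hz, sampled at a given frame rate, shows up
# as an aliased frequency folded into the range [-fps/2, +fps/2].

def aliased_frequency(true_hz: float, fps: float) -> float:
    """Frequency actually perceived when true_hz motion is sampled at fps."""
    return (true_hz + fps / 2) % fps - fps / 2

# A wheel spinning at 290 Hz filmed at 300 fps appears to spin at -10 Hz,
# i.e. slowly backwards, even though 300 fps is far beyond anything the
# eye resolves frame by frame.
print(aliased_frequency(290, 300))   # -10.0
print(aliased_frequency(1000, 300))  # 100.0
```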
No one was talking about watching movies at 10fps. And while filming at higher framerates might create a more "lifelike" reproduction, that doesn't actually make it a better experience. Video games on the other hand have many other considerations when factoring in higher framerates.
http://www.tested.com/art/movies/452387-48-fps-and-beyond-how-high-frame-rates-affect-perception/
That's exactly what it looked like to me.
It's like the simplest maths that the higher the frame rate, the lower the response time. If you go from 30 frames to 60 frames, you have double the frames, and with double the frames each frame is on screen for half as long, so input response is twice as fast.
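The arithmetic above can be sketched in a couple of lines. Note this only models the frame-time component; as pointed out later in the thread, whether total input latency actually halves depends on the rest of the pipeline (input polling, display lag):

```python
# Frame time is the reciprocal of frame rate, so doubling the rate
# halves the time each frame is on screen.

def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

print(frame_time_ms(30))  # ~33.3 ms per frame
print(frame_time_ms(60))  # ~16.7 ms per frame
```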
> What you are seeing here is the most advanced built-in 30/60 frame switch.

Well, that does not include you then.
> I remember when I was in college doing some research for a Video & Graphics presentation, I came across some biological studies saying that the human eye can't see any difference in any motion above 100 fps.

That's only true if you have perfect temporal anti-aliasing (which necessitates eye-tracking).
http://www.bbc.co.uk/rd/blog/2013/12/high-frame-rate-at-the-ebu-uhdtv-voices-and-choices-workshop
Being elitist doesn't change your vision.
There IS a difference between 30 and 60 FPS. It's a fact. Don't see how it's even debatable to be honest.
Whether or not you mind is another thing. Just don't deny the difference.
I have glasses and I started to notice some lag because of them; I see everything pretty much 0.5 sec later ;( which really sucks in traffic.
I need thinner glasses, maybe I can get the lag down a bit.
Wait what? Is this a joke?
> Response time isn't even what's being discussed. You can have a 120 Hz screen that looks better in motion but has an equivalent response time to a 60 Hz screen, just like you can have one 60 Hz screen with a 12 ms response time and another with 40 ms. A slow response time is what caused ghosting on older LCD TVs and monitors. Motion blur is used to make 30 fps games look smoother, but that's not quite the same thing as ghosting, and it's often simply used for effect.

The response time comment was in response to OP's comments in the thread, where he said the guy denies that too. But yes, increasing the frame rate doesn't always decrease the response time.
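The distinction being drawn here can be made concrete with a rough worst-case model (a sketch with hypothetical numbers, not measurements): refresh rate sets how often a new frame is drawn, while panel response time is how long a pixel takes to finish changing after the frame arrives, and the two add up independently.

```python
# Rough worst-case "input to settled pixel" model: the input just misses
# a refresh (one full frame interval of waiting), then the pixel still
# needs its response time to finish transitioning.

def worst_case_lag_ms(refresh_hz: float, response_ms: float) -> float:
    frame_interval = 1000.0 / refresh_hz
    return frame_interval + response_ms

# A 120 Hz panel with a slow 40 ms response can lag more overall
# than a 60 Hz panel with a fast 12 ms response:
print(worst_case_lag_ms(120, 40))  # ~48.3 ms
print(worst_case_lag_ms(60, 12))   # ~28.7 ms
```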
Higher framerate is objectively better if the question is "Which provides a more authentic experience?" in terms of relation to real life, or at the very least the reality of what it looked like on set.
I don't think that's the question you're asking though. I think the question you're asking is "Which do I like more?" which is of course open to interpretation by every individual.
I think the both of you are arguing different things.
Also worth noting is that a higher framerate isn't inherently better, as plenty of people don't like motion-interpolated movies and TV. The motion interpolation doesn't remove any of the original frames and only adds estimated data, so if the argument is "higher frame rate is always better", whoever is arguing that has to work around that issue.
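The "only adds estimated data" point can be illustrated with the crudest possible interpolator, which just blends adjacent frames. This is a toy sketch (real TVs use motion-vector estimation, not plain blending); frames here are flat lists of pixel intensities:

```python
# Naive "motion interpolation": keep every original frame and insert an
# estimated in-between frame after each one. The originals survive
# untouched; only the inserted frames are guesses.

def interpolate(frames):
    out = []
    for a, b in zip(frames, frames[1:]):
        out.append(a)
        out.append([(x + y) / 2 for x, y in zip(a, b)])  # estimated frame
    out.append(frames[-1])
    return out

clip = [[0, 0], [10, 20], [20, 40]]
print(interpolate(clip))
# [[0, 0], [5.0, 10.0], [10, 20], [15.0, 30.0], [20, 40]]
```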
> I've always wondered if we used more than 10% of our brain if we could see more than 30fps with our eyes.

> That's a different question from "what frame rate do I need for smooth motion".
The 10% idea is also a myth.