
48fps movies vs 60fps games?

No,

It does not matter how much something moves in world space!

As long as it takes up the same amount of space in your vision, it will look the same!

Motion blur in games is just an artistic choice that makes lower framerates less hard on the eyes by blending frames together into a blurry mush.
Wow, this is so wrong it hurts.

Again, the way rendering is done in video games is like a camera being run at a 1/200000 shutter speed (that's a minuscule fraction of a second, by the way). It looks completely unnatural. Your monitor cannot produce that kind of motion blur, and even if it could (by running at some obscenely high frame rate, most likely), it'd still be better to add the motion blur as a post process, because it's considerably less expensive.

Edit to clarify: Let me put it another way. The reason film blurs is because it is recording an interval of time. If a video is being recorded at a 1/24 shutter speed (that's one twenty-fourth of a second), it is going to record all the movement done within that span of time, and all of that movement results in blur.

Your monitor cannot do this.
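To illustrate the post-process point, here's a hand-wavy NumPy sketch of velocity-buffer motion blur. The function and its inputs are hypothetical (not any particular engine's API); real games do this in a fragment shader, but the idea is the same: blend each pixel with samples taken along its screen-space velocity vector to fake a longer "shutter".

```python
# Illustrative only: approximate a longer exposure by averaging samples
# taken along each pixel's motion vector.
import numpy as np

def motion_blur(color, velocity, samples=8):
    """color: (H, W, 3) float image; velocity: (H, W, 2) per-pixel motion
    in pixels since the previous frame (an assumed precomputed buffer)."""
    h, w, _ = color.shape
    ys, xs = np.mgrid[0:h, 0:w]          # pixel coordinate grids
    out = np.zeros_like(color)
    for i in range(samples):
        t = i / (samples - 1) - 0.5      # step from -0.5 to +0.5 along the vector
        sx = np.clip((xs + velocity[..., 0] * t).astype(int), 0, w - 1)
        sy = np.clip((ys + velocity[..., 1] * t).astype(int), 0, h - 1)
        out += color[sy, sx]             # accumulate the shifted sample
    return out / samples                 # the average stands in for a longer exposure
```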

I'm just going to link this guy: http://www.volnaiskra.com/2012/03/why-more-games-should-support-motion.html

Try the following experiment:
Hold up a book in front of your face, at a comfortable reading distance
Now move the book around in circles, fairly fast
As soon as you move it, can you read the text? It's all a blur, right? Even if you follow the words with your eyes, the movement makes them significantly harder to make out.

Not surprised? Wondering why I asked you to do such a pointless experiment with such an obvious outcome? Because many people in the gaming world, including some of the biggest game developers, don't seem to understand this simple fact. Take a look at the following image, from 2011's Rage:

[Image: rage.jpg - in-game screenshot from Rage]


That picture was taken while the camera was moving. Moving very fast, in fact. Yet it's crystal clear: you can read the text on the shop sign, you can see fine details in the clouds. For the main character of Rage, the world looks exactly the same when he's whizzing past it as it does when he stands perfectly still. It's one of the most visually unrealistic elements of the game engine.

It's perhaps unsurprising that Rage has such a graphical flaw, since the quality of the game's engine has been harshly criticised by many, mainly for its disappointingly lo-fi textures. But Rage is far from alone, and the games that lack motion blur probably outnumber those that don't.

In fact, there are many gamers who still think that motion blur is some kind of gimmick or over-zealous eye-candy effect, or that its sole purpose is to lessen the jitteriness of low framerates. This is way off. Actually, motion blur is an attempt to bring an element of realism into gameworlds in the same way lighting and shadowing are - one that's too often missing. The medium of games is actually the odd one out here - almost no other medium provides this unnatural and unrealistic level of sharpness during motion.
 

Wonko_C

Member
Speaking of motion blur, in games the higher the framerate, the harder it is for me to notice it. I tried capping Street Fighter IV on PC to 30 fps, and it's amazing how much motion blur goes unnoticed when playing at its normal 60 fps.
 
Speaking of motion blur, in games the higher the framerate, the harder it is for me to notice it. I tried capping Street Fighter IV on PC to 30 fps, and it's amazing how much motion blur goes unnoticed when playing at its normal 60 fps.
I believe this is because the motion blur is set up to blend a fixed number of previous frames together.

For example, if a game's motion blur takes the last 3 frames and blends them together (with the oldest frame being the most faded), then at 30 FPS that's actually 1/10 of a second, whereas at 60 FPS it becomes 1/20 of a second (these are just hypothetical numbers, obv - I don't know how the devs actually do it). Hence, it just becomes less noticeable at higher frame rates.

Ideally, though, they should be measuring in time units instead of raw frames to maintain a consistent look, but I can understand that doing it that way just makes things a LOT more complicated.
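A minimal sketch of that difference, reusing the hypothetical 3-frame blend from above (all numbers invented, as the post says):

```python
# Hypothetical numbers: a fixed 3-frame blend vs a fixed-duration blur
# window at different framerates.

def blur_window_seconds(num_frames: int, fps: float) -> float:
    """Frame-count blur: the blur window shrinks as the framerate rises."""
    return num_frames / fps

def frames_for_window(target_seconds: float, fps: float) -> int:
    """Time-based blur: blend however many frames cover a fixed window,
    keeping the look consistent across framerates."""
    return max(1, round(target_seconds * fps))

for fps in (30, 60):
    print(f"{fps} fps: 3-frame blend spans {blur_window_seconds(3, fps):.3f} s; "
          f"a fixed 0.05 s window needs {frames_for_window(0.05, fps)} frames")
# At 30 fps the blend spans 0.100 s (the 1/10 s above); at 60 fps it
# halves to 0.050 s, which is why the effect becomes less noticeable.
```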
 

apana

Member
Just like sound, color, and 3D. Technical advancements aren't needed to make good movies, but they allow filmmakers to make different movies, and that's good.

All I'm saying is that I am going to see it for myself, let people have whatever reactions they have, and then analyze it. Not all technological advancements are created equal; people may or may not like something for whatever reason. People reacted very differently to sound and color than they did to 3D. There is no need to be dogmatic about an issue like frame rates.
 
Higher framerate films will do wonders in cleaning up ugly judder, but I hope that HFR film starts replicating motion blur at some point. Every 48fps video I've watched feels sorely lacking in that regard and looks slightly unnatural, even though it does solve a lot of the ugliness of the 24fps format.

I dunno. I'm anxious to see The Hobbit in 48fps and 24fps to make a better judgment. 60fps in games though, always and without question.
 

Tess3ract

Banned
Who knows.

BTW, 3D becomes magical at 60fps. 1080P @60fps has such amazing animation. IMO framerate is even more beneficial to 3D than 2D.

Then again, I seem to remember reading stories where people have made the dizzy claim with every tech advance Hollywood has tried in the past 100 years. So it seems par for the course.

I don't really get the size thing either. People sit so far away that the typical movie theater has a smaller FOV than sitting 2 feet from a monitor. So I suspect these people would be having the same issue with 3D games on a small monitor 2 feet away.

EDIT: Yikes at all the errors. And that's why you preview a post.

If you think 60fps in 3D looks good, you'd wet yourself at 3D at 120FPS.

Basically, the reason it looks good is that in 3D each of your eyes only sees half the frames, so about 30fps out of 60. Getting 60fps per eye looks even better.

Then there's 240Hz/240fps (which is out of the realm of most things, unless you're playing Unreal Tournament 2004 or something).
 

Gek54

Junior Member
lol what? faster fps doesn't mean faster reaction time, it means you get to see the thing that you have to react to at an earlier time.

if a signal takes 16ms to reach you, and it takes you 200ms to react to it, then the total time is 216ms. if a signal takes 33ms to reach you, and it takes 200ms to react to it, then the total time is 233ms.

reaction time has nothing to do with it. if anything, it'll be limited by the max fps that the human eye can see. and i don't think that cap is 30fps.

Also twice the frame rate means you have twice as many reference points to better judge motion before reacting to it.
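That arithmetic in sketch form (the 200ms figure is just an assumed round number for human reaction time):

```python
# Toy version of the arithmetic above: total response time is one frame
# of display delay plus a fixed (assumed) 200 ms human reaction time.
REACTION_MS = 200

for fps in (30, 60):
    frame_ms = 1000 / fps  # how long you may wait for the signal to appear
    print(f"{fps} fps: {frame_ms:.0f} ms frame delay -> "
          f"{frame_ms + REACTION_MS:.0f} ms total")
# 60 fps: ~17 + 200 = ~217 ms; 30 fps: ~33 + 200 = ~233 ms. Reaction
# time is unchanged; the frame delay is what shrinks.
```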
 

jbueno

Member
That 48 FPS trailer looks fantastic, can't wait to see it. I was honestly expecting it to look awful after reading loads of complaints.
 
Higher framerate films will do wonders in cleaning up ugly judder, but I hope that HFR film starts replicating motion blur at some point. Every 48fps video I've watched feels sorely lacking in that regard and looks slightly unnatural, even though it does solve a lot of the ugliness of the 24fps format.

I dunno. I'm anxious to see The Hobbit in 48fps and 24fps to make a better judgment. 60fps in games though, always and without question.
The motion blur issue is easily solved by using a slower shutter speed (i.e. a wider shutter angle), since shutter angle is separate from frame rate.

Keeping the shutter angle the same while increasing the frame rate (and thus shortening the exposure) is a conscious choice. It is not an intrinsic factor.
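For reference, exposure time works out to (shutter angle / 360°) divided by the frame rate, so per-frame blur and frame rate really are independent dials. A quick sketch with illustrative numbers:

```python
# Exposure per frame = (shutter_angle / 360) / fps, so blur and frame
# rate can be chosen independently. Numbers below are illustrative.

def exposure_seconds(shutter_angle_deg: float, fps: float) -> float:
    return (shutter_angle_deg / 360.0) / fps

print(exposure_seconds(180, 24))  # classic 24 fps, 180 degrees -> 1/48 s
print(exposure_seconds(180, 48))  # same angle at 48 fps -> 1/96 s, half the blur
print(exposure_seconds(360, 48))  # fully open shutter at 48 fps -> 1/48 s again
```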

Then there's 240Hz/240fps (which is out of the realm of most things, unless you're playing Unreal Tournament 2004 or something).
This one just made me laugh. I mean, golly, that's so far out of the realm of possibility that I can't even fathom it. Ten times as much storage space would be needed to record video at that frame rate. Four times as much GPU and CPU power would be needed to play games at that kind of frame rate (assuming 60 FPS was the standard... which, let's face it, it probably never will be (edit: at least for consoles)).

I'd like to see it, just for the novelty of it, but I somehow doubt the benefits would be anywhere near as clear as the jump from 24 -> 60.
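For what it's worth, the ratios above assume storage and rendering cost scale linearly with frame rate - a rough but reasonable first approximation:

```python
# Rough linear scaling: storage relative to 24 fps film, rendering
# cost relative to a hypothetical 60 fps standard.
print(240 / 24)  # 10.0 -> ten times the storage of 24 fps video
print(240 / 60)  # 4.0  -> four times the GPU/CPU work of a 60 fps game
```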
 

Tess3ract

Banned
This one just made me laugh. I mean, golly, that's so far out of the realm of possibility that I can't even fathom it. Ten times as much storage space would be needed to record video at that frame rate. Four times as much GPU and CPU power would be needed to play games at that kind of frame rate (assuming 60 FPS was the standard... which, let's face it, it probably never will be).
I was making the comparison. They do have 240Hz TVs. I was coming up with an older game that you could conceivably play right now at 240 fps.
 
It's okay, we can't all be good at everything.

You do always find a way to come off as passive aggressive...

60fps on consoles isn't as important to me as on PC, where you need instant input with no delay or stutter.

I'd just like all games to have at least a locked 30fps with no input delay.
 

SapientWolf

Trucker Sexologist
Wow, this is so wrong it hurts.

Again, the way rendering is done in video games is like a camera being run at a 1/200000 shutter speed (that's a minuscule fraction of a second, by the way). It looks completely unnatural. Your monitor cannot produce that kind of motion blur, and even if it could (by running at some obscenely high frame rate, most likely), it'd still be better to add the motion blur as a post process, because it's considerably less expensive.

Edit to clarify: Let me put it another way. The reason film blurs is because it is recording an interval of time. If a video is being recorded at a 1/24 shutter speed (that's one twenty-fourth of a second), it is going to record all the movement done within that span of time, and all of that movement results in blur.

Your monitor cannot do this.

I'm just going to link this guy: http://www.volnaiskra.com/2012/03/why-more-games-should-support-motion.html
The brain doesn't need motion blurring to perceive the illusion of high-speed motion, provided there are enough frames. In fact, most gamers will happily sacrifice motion blur for higher framerates.
 
Every time I go to the Sony store, they have their movies playing at a high frame rate (I'm not sure how high), but it looks pretty silly. Everything moves more fluidly, but it seems sped up and looks very unnatural. I'm sure some of it is that I'm not used to it, but at the same time, something is off.
 

superbank

The definition of front-butt.
The "soap opera look" does not look as good as 24fps. Simple as that. 24fps creates an idealized view of the world, it's sort of slower and blurrier, fine details in movement are missed. You'd think people would like more detail but the real world is not what people want to see when indulging in entertainment. I haven't seen the Hobbit but if it looks like a daytime soap opera than people will call it out as looking worse, justifiably so.

For videogames it's different. The view is not being recorded by a camera. There is no light going through a lens past a shutter to capture that moment for a fraction of a second. It's being generated and displayed as is (and maybe with some effects). Movie and game frames can't be compared because the output is so different. Also, higher fps is good for responsiveness in games; you don't really need that for a movie, though I'm sure someone could use it creatively.
 
I was making the comparison. They do have 240Hz TVs. I was coming up with an older game that you could conceivably play right now at 240 fps.
Wow. 240Hz TVs... wow.

That is something I will probably never buy.

(And not just because Half-Life 2 is one of the only games I have that would actually run at 240 FPS)

The brain doesn't need motion blurring to perceive the illusion of high-speed motion, provided there are enough frames. In fact, most gamers will happily sacrifice motion blur for higher framerates.
That's nice, but it's still completely unrealistic. At the very least, it should be an option, one where you can tweak the severity of the effect (which would be the equivalent of changing the shutter angle in film).
 

Tess3ract

Banned
The brain doesn't need motion blurring to perceive the illusion of high speed motion, provided there is enough temporal bandwidth (frames). In fact, most gamers will happily sacrifice motion blur for higher framerates.
Exactly. I will always turn motion blur off. It looks like shit and it reduces the IQ. I can see information fine but if there's tons of blur suddenly I can't see shit.

Also, there are those crappy vignettes in the four corners of the screen, or when you get injured and blood covers the whole fucking thing.

Wow. 240Hz TVs... wow.

That is something I will probably never buy.

(And not just because Half-Life 2 is one of the only games I have that would actually run at 240 FPS)
Also 480Hz:

Recently, at CES 2010, a couple of manufacturers introduced a 480Hz specification on some LED-LCD TVs. As of this writing, all of these specifications are of the LED backlight manipulation variety, as opposed to a true 480Hz refresh rate.
 
The motion blur issue is easily solved by using a slower shutter speed (i.e. a wider shutter angle), since shutter angle is separate from frame rate.

Keeping the shutter angle the same while increasing the frame rate (and thus shortening the exposure) is a conscious choice. It is not an intrinsic factor.

Yeah, I know that much, but it's still a factor in all 48fps material I've seen (including the Hobbit trailer).
 
Exactly. I will always turn motion blur off. It looks like shit and it reduces the IQ. I can see information fine but if there's tons of blur suddenly I can't see shit.

Also, there are those crappy vignettes in the four corners of the screen, or when you get injured and blood covers the whole fucking thing.
I find this interesting. I'm pretty sure you'd be hard-pressed to find a filmmaker who would be willing to shoot film at a 1/200s shutter speed.

But of course, yes, playing a game is very different from recording video, so it's a good thing that turning it off is an option for players like you.

Oh, and the 'blood in my eyes!' thing is silly, I agree. A little red at the edges should be good enough. There's a reason there's so much satire of the 'strawberry jam in your eyes' effect. Admittedly, I am a fan of the screen fading the closer you get to death. It generates a sense of unease and really drives home the fact that, hey, you're about to kick the bucket!

Also 480hz
I'm sure all the surgeons of the world are rejoicing as we speak.
 
High frame rate TV has historically been associated with "cheap" and "amateur" because of its use in lower budget productions such as soap operas, while 24fps has "professional" and "blockbuster" connotations because it has been the preferred standard for most Hollywood films. This kind of "language" of a media format has become encoded into society.

And like any language, it can change. If Hollywood blockbusters started being made in 48 or 60fps, high frame rates would start being associated with blockbuster production quality, and 24fps would look antiquated by comparison - much like how 16fps silent films feel today.
 

Tess3ract

Banned
I find this interesting. I'm pretty sure you'd be hard-pressed to find a filmmaker who would be willing to shoot film at a 1/200s shutter speed.

But of course, yes, playing a game is very different from recording video, so it's a good thing that turning it off is an option for players like you.

Oh, and the 'blood in my eyes!' thing is silly, I agree. A little red at the edges should be good enough. There's a reason there's so much satire of the 'strawberry jam in your eyes' effect. Admittedly, I am a fan of the screen fading the closer you get to death. It generates a sense of unease and really drives home the fact that, hey, you're about to kick the bucket!


Yes, in fact RIFT does the screen-fading thing. It's okay, I guess.

But regardless, in a game it's all about fidelity for me, not trying to recreate realistic aspects of film. Anything that gets in the way of clarity is a hindrance.
 
Really, it's hard to scientifically test this theory, but I'm certain that if you showed a 30fps and a 60fps clip to someone who had never seen a movie before, there would be no question as to which is preferable.

I think the only issue comes from us having grown up with 24fps as the standard; the next generation will have no issue (if it does indeed take off after The Hobbit).
 

Wonko_C

Member
Also, there are those crappy vignettes in the four corners of the screen, or when you get injured and blood covers the whole fucking thing.

I hate that every other game has jumped on the vignette bandwagon; even racing games like Forza Horizon are doing it. Who thought obscuring the corners of my screen and making them round like a tube TV was a good idea? It defeats the purpose of flat-screen TVs.
 

watershed

Banned
60fps for games looks amazing. 48fps for movies looks like I'm watching a news broadcast, or any HDTV with that flow-motion crap turned on.
 

Teletraan1

Banned
I think action sequences look infinitely better at a higher frame rate, but panning during dialogue or non-action-intensive sequences looks really off and almost cheap. It's far too reminiscent of early widescreen movies shown on TV in 4:3, where a digital pan would ping-pong back and forth during dialogue and create a nauseating effect. This hatred of 60fps is probably just a Pavlovian response to that.

I would like to see some filmmakers employ a mixed frame rate, if that's possible.
 

danwarb

Member
I think it's just a matter of getting used to a much better frame rate in movies.

Low frame rate or a weird FOV are the big causes of motion sickness in games.
 

apana

Member
Stigma because we are not used to it, I guess. For decades, color in film was considered something that could only fit fantasy and cartoons.

You aren't used to it in films either. None of us are, and you don't know how the public at large will respond. It could be like color and win out over time, or it could never catch on.
 

danwarb

Member
More frames mean fewer ugly pans, and handheld footage wouldn't be as painful to view. The sooner people get over what 60fps is currently associated with and it becomes a real option in movies, the better.
 
I mean at SOME point movies have to up their fps.

No they don't.

Games are constantly doing it.

No they aren't.

As someone exposed to a lot of 50i footage in a professional capacity, I can tell you the movement looks like cheap garbage. I'm deathly curious to see what the HFR release of The Hobbit looks like, but I couldn't have lower expectations.

HFR will lessen the ugliness of 24fps and its uneven pulldowns, and that's about it.
 

Aselith

Member
So it's just what people are used to? I mean at SOME point movies have to up their fps. Games are constantly doing it.

What a weird thing to say. And no, games are not constantly upping their framerate. 30 is the standard on consoles, and 60 is the standard on PC if you can manage it.
 
It's almost a century that 24fps has had exclusive reign in motion pictures. I personally would love to see a progression to 60fps in movies and games. People will get used to it over time. I support the Apple route... cut off 24fps entirely. People will b*tch for about five years, then all is well. Yes, I know this is unlikely to happen. Too many stubborn people set in their ways.

Also, if high frame rate doesn't add anything to the movie, then neither do color or surround sound.
 