
Justifying 30 FPS As An Artistic Choice

If by artistic purposes you mean "make the game look as good as possible", fine. Don't say bullshit like cinematic. Just say you're willing to let gameplay suffer a bit for the improvement of graphics. The general public is totally fine with it.
 
Contradictions runs at 24 fps and it's fully justified. The cinematic excuse even works in their case!

( ͡° ͜ʖ ͡°)
 
This is practically the only time I think a game would actually be worse with a higher framerate.

Not true at all.

They could still animate everything at 24fps to achieve the same look as the show, but have the game run at 60 (or more on PC). A smooth scrolling background would not make the game look less like the show.
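Decoupling the animation sample rate from the render rate is a real technique. A minimal sketch in Python (hypothetical names, not from any actual engine): character poses are sampled on a clock quantized to 24fps steps, while the camera and scrolling background advance on the raw clock every rendered frame.

```python
import math

def quantize(t, fps=24.0):
    """Snap a continuous clock to discrete animation steps, so poses
    only change 24 times per second even if we render at 60+."""
    return math.floor(t * fps) / fps

def frame(render_time):
    """One rendered frame: smooth motion (camera, backgrounds) uses the
    raw clock; stepped character animation uses the quantized clock."""
    camera_t = render_time             # advances every rendered frame
    pose_t = quantize(render_time)     # holds each pose for 1/24 s
    return camera_t, pose_t
```

At a 60fps render rate, `quantize(0.05)` and `quantize(0.06)` both land on the same 24fps step (1/24 s), so the character holds its pose across those frames while the background keeps scrolling smoothly.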
 
There are a few times where it actually improves the aesthetic of the game. South Park or Claymation/stop motion animated games.

That's about it.
 
I'm not in the crowd that thinks every game should be 60 FPS. Honestly, I wonder if a game like P.T. would have the same effect if it was running at a full 60 FPS. The higher framerate would probably make the game feel more "video gamey" than it probably should.

It should be 'video gamey' because it's a video game.
 
I'm not in the crowd that thinks every game should be 60 FPS. Honestly, I wonder if a game like P.T. would have the same effect if it was running at a full 60 FPS. The higher framerate would probably make the game feel more "video gamey" than it probably should.

Hmm what's the reasoning behind that? Lower fps = less "video gamey"?
I'm quite sure it's the opposite: P.T. would be a lot more terrifying at 90fps in VR. Games tend to emulate reality, not movies, so the lower-fps thing makes no sense.
 
If by artistic purposes you mean "make the game look as good as possible", fine. Don't say bullshit like cinematic. Just say you're willing to let gameplay suffer a bit for the improvement of graphics. The general public is totally fine with it.

Yeah. Devs don't "choose" FPS, they choose better graphics. Better graphics are the cinematic part, the framerate is what's sacrificed to achieve it.
 
Not true at all.

They could still animate everything at 24fps to achieve the same look as the show, but have the game run at 60 (or more on PC). A smooth scrolling background would not make the game look less like the show.

On TV, those scenes scroll at 24 fps. The only reason South Park runs at 30 is refresh-rate synchronization.
 
The gaming community wanting 1080p + 60 FPS on consoles costing $400 and under, which also have to run a bloated OS in the background, is a classic example of wanting to have your cake and eat it too.
 
If by artistic purposes you mean "make the game look as good as possible", fine. Don't say bullshit like cinematic.

I think that's just semantics to be honest.
You can put "cinematic" on the back of a box and in your marketing. You can't really put "we made the game look as good as possible" in a press release.
 
I think devs can make stuff at whatever framerate they want. There shouldn't be a "standard". And if you find that to be wrong, don't buy it.



I agree with this completely.
 
What do you expect them to say then?

"Yeah, these consoles are complete pieces of shit. If it were up to me, it'd be 60fps. Fuckin' shitty consoles. Play this shit on PC instead, guys. Performance and graphics, the PC's got it all. Fuck consoles."

- Game dev edgelord
 
Frame rate is a key factor in gameplay. However, there are many games where the focus isn't on gameplay but on story or a specific aesthetic; in those situations, a smooth 30fps may be better than a fluctuating framerate with a cap of 60.
 
I think that's just semantics to be honest.
You can put "cinematic" on the back of a box and in your marketing. You can't really put "we made the game look as good as possible" in a press release.

In which case, say nothing. People that know what fps a game should run at (or at least pay attention) aren't going to be convinced by such BS; people that don't know don't care.
 
It's definitely defensible from an artistic point of view: 30 FPS has a particular look and feel and it's possible that someone directing a game wants that specifically.

It has drawbacks in terms of functionality, however, which can make it a harder decision to justify in other areas.
 
I suppose you could make a case for it "artistically" if you really mean "graphics".

Which is how I read this type of PR these days.
 
Even Naughty Dog knows the "cinematic framerate" is bullshit. The Last of Us Remastered and Nathan Drake Collection are immensely superior experiences at 60. If the leading developer of cinematic games understands the benefits of 60fps, there really isn't any excuse for the rest of them.

I don't have any problem with devs choosing to target 30 for improved visual effects over 60, but don't try to justify it with some "cinematic experience" bullshit, is all.
 
They should stop with this excuse already. Everybody knows the real reason is that console hardware isn't that good, and they'd rather have a better-looking game with good effects than smoother 60fps gameplay.
 
I just don't care anymore. Release it at any framerate above 30. Just make sure it doesn't dip below 30 like all the big AAA games have been doing since 2006.
 
Just say you can't hit 60 and it's fine, no need to insult the intelligence of your consumers. The Order devs were incredibly dumb about it.

There's nothing wrong with a game running at 30fps. It doesn't need to be "justified."

It's objectively worse than 60. This is a fact.
 
The gaming community wanting 1080p + 60 FPS on consoles costing $400 and under, which also have to run a bloated OS in the background, is a classic example of wanting to have your cake and eat it too.

To be fair, all three consoles are perfectly capable of 1080/60. But devs rarely sacrifice graphics for framerate.
And if they try, they get the marketing department on their case because framerate isn't visible in screenshots.
 
The fact it's even an argument seems to imply there is a difference of choice between 30 and 60.

It really seems to be a matter of preference once you get technical limitations out of the picture. Like how some people hate the high-frame-rate Hobbit movies, or how I can't stand motion smoothing.

It should always be an option to switch to 30 FPS if you prefer it.
 
Ico is a good example of framerate being a part of establishing the mood. Surreal and dreamy game with motion to match. 60fps Ico is cool, but I wouldn't want that above the original fps.
 
I assume nearly everyone on GAF enjoys HFR films then yes? I mean it's literally the reverse problem lol, all the power in the world to create high framerates, and choosing to display otherwise.
 
What difference does it make whether they say it or not?

What is being put forward by the OP is that they needn't use the "cinematic" excuse to justify the frame rate. Especially in the case of most games where the PC version runs unlocked (which is also true of Layers of Fear).
 
The "cinematic" thing is pretty much marketing speak, though. But we've experienced so much stuff at 30fps, including video games for some time now, that we're used to it. I'm just surprised any dev tries to use that to justify something they don't really need to. "Yeah, our game runs at 30 frames per second... just like a bunch of other games for the last 20 years."
 
How about you pay attention to what is being discussed here.

I am. Frame rate and resolution topics do tend to end up teetering on the edge... For what it's worth, the devs probably end up justifying it to themselves as a cinematic choice, like Need for Speed. Capping a racing game on PC to have parity with the console version for artistic reasons or whatever was just crazy.
 
Wait, why do you think devs wouldn't genuinely want to hit a framerate that is cinematically more accurate to what is used in actual films?

One big reason: there is no controller input when you are watching a movie.

Cutscenes or something like that would be fine.

A big part of the problem is that devs look at films and think there is some checklist to the "cinematic look" or experience.

24 fps? Check!
Motion blur? Check!
Bokeh/shallow depth of field? Check!
Chromatic aberration? Check/vomit!

Aaaand it's basically the same checklist that every single budding/amateur filmmaker uses to try and get a "cinematic look", ignoring more important stuff like nuanced camera movement, lighting, blocking, etc. And that's all while ignoring the script!
 
I think that's just semantics to be honest.
You can put "cinematic" on the back of a box and in your marketing. You can't really put "we made the game look as good as possible" in a press release.

Nobody is putting "cinematic framerate" on their box. It's when they discuss framerate and start throwing around terms like "cinematic". It's dumb, and an unwillingness to admit a trade-off they made.
 
Wait, why do you think devs wouldn't genuinely want to hit a framerate that is cinematically more accurate to what is used in actual films?

Because video games are not films

Even Naughty Dog knows the "cinematic framerate" is bullshit. The Last of Us Remastered and Nathan Drake Collection are immensely superior experiences at 60. If the leading developer of cinematic games understands the benefits of 60fps, there really isn't any excuse for the rest of them.

I don't have any problem with devs choosing to target 30 for improved visual effects over 60, but don't try to justify it with some "cinematic experience" bullshit, is all.

amen

TLOU:R feels objectively far better at 60fps - even the cinematics look far better at 60.
 
Why do people keep talking about film in framerate discussions. Videogames and films have nothing in common technically speaking. Not even the concept of framerate is the same.
 
Alas, the idea of 30fps as an artistic decision is mostly nonsense. You can pump out slightly shinier visuals that way, but that's about the only visual benefit you get.

To be fair, 30fps can look fine, especially in slower-paced games where you're not expecting a lot of twitchy motion. However, all else being equal, 60fps looks just a little bit better due to motion appearing a bit smoother.

You could try to make an argument about going for a film like experience, but games aren't movies, and a lower frame rate isn't going to make a game look like a movie.
 
People need to slow their roll on this one.

The Layers of Fear guys weren't talking about 60 vs 30. They were talking about a capped (and presumably stable) frame rate vs constant fluctuation.
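A capped frame rate is usually held by padding each frame out to a fixed time budget. A minimal sketch in Python (hypothetical helper names, not the actual Layers of Fear code): whatever time the frame's work leaves over is slept away, so frame delivery stays even instead of fluctuating with load.

```python
import time

TARGET_FPS = 30
FRAME_BUDGET = 1.0 / TARGET_FPS  # ~33.3 ms per frame

def run_capped(num_frames, do_frame):
    """Run num_frames iterations, padding each one out to FRAME_BUDGET
    so pacing stays constant even when do_frame finishes early."""
    for _ in range(num_frames):
        start = time.perf_counter()
        do_frame(FRAME_BUDGET)  # simulate/render with a fixed timestep
        leftover = FRAME_BUDGET - (time.perf_counter() - start)
        if leftover > 0:
            time.sleep(leftover)  # burn the remaining budget to hold the cap
```

The point of the cap is the `time.sleep` on fast frames: a frame that finishes in 5 ms still takes the full 33 ms slot, so consecutive frames are evenly spaced rather than alternating between fast and slow.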
 
While I generally agree with the OP, I will relay a tiny anecdote.

I have discovered that I cannot play Project Diva in 60 fps. For those that don't know, most PD console games are 30fps, but a few games in the "Dreamy Theatre" sub-series (of sorts) are 60 fps.

I've been playing Project Diva for about two years, and bought my first Dreamy Theatre game last summer, largely because I was excited at the prospect of 60 frames. To my disappointment, I discovered that the higher framerate made the notes move "strangely" to my eyes. I'm sure that it's definitively better, but I was just too used to the lower framerate.
 
Somewhat related. Had my art teacher tell us today something along the lines of 'The eye has around 60 million pixels and can look around at 24fps!'

That was one hell of a statement.
 
I can actually understand the cinematic feel thing, though it can't just be frame rate, you need the blur and steady, constant frame pacing too to achieve it. If it's just 30fps alone it's simply shoddy workmanship.

Thankfully, I'm not fussed as long as it's consistent. I've been playing Rise of the Tomb Raider over the last few days, fiddling with the settings to compare a locked 30 and 60 fps, and I've come to the conclusion that the difference isn't that noticeable to me; I'm happy to trade 60fps for distinctly nicer visuals.
 