
Can we please stop with the whole "60 fps is not cinematic" argument.

60 is way too many.
This is true. Fewer frames also help to provide a sense of mystery - that you're literally missing something, at the margins, and this vacuum subconsciously cues your mind to fill in the blanks from within your own active imagination, which actually increases engagement and story investment. It's a technique called closure.

And yes, if you've been paying attention you've already realized that I am entirely full of shit.

ALL games are better at 60.
Truth, all else being equal. Unfortunately all else is not equal, so choices are made that end up giving us a ~30fps standard on consoles.
 

Myansie

Member
The cinematic posts in the Last of Us threads are irritating me no end. That look of a low frame rate that is being cheered is blur or strobing (depending on the shutter speed). It's not much different from a low resolution and is not good to look at. Artistically, low frame rates can be used for the same reason you might choose to shoot on 16mm or video: it gives a nostalgic look and will blur your image to hide the issues with your production design. There's little point in 1080p without 60fps; the clarity gained from the res jump is lost with any kind of movement. In fact, at 1080p it's clear enough that you can easily see the blur from 60fps. A game or movie's frame rate is a compromise in the vast majority of situations. I'm in the camp hoping James Cameron goes a little bonkers and shoots the Avatar sequels at 120fps.
 
Agreed. I always argued about the Hobbit's 48fps. It only looks "like a soap opera" because that's what we're conditioned to believe about HFR content, thanks to a history of budget TV shows being recorded as low-quality but high-frame-rate interlaced video. If every Hollywood film switched to 60fps, 60fps would be the new "feel of film" and 24fps would seem dated and stuttery - much like 16fps silent film seems to us today.
 
Not really. Those are subjective statements, while saying 30 fps is better than 60 in a video game is simply untrue. They are hyperbole, but I personally don't play games that have both of those qualities. 900p30 should be the absolute bare minimum for any game now.

It's not untrue. It's simply different. 60fps gives you better fluidity and response time with a major hit to how much RAM you can access. 60fps is different from 30, not just automatically better. It's all about the pros and cons and what your personal preference is.

Also, why aren't movies filming in 60 instead of 48? Seems counterproductive.
 

Loofy

Member
30fps does look more cinematic. That's a fact.
If it wasn't, then why aren't developers using 60fps for FMV scenes?

That said, if I had to choose, a more cinematic experience takes a backseat to more fluid 60fps gameplay.
 

Jedi2016

Member
I'm going to be that annoying guy who quotes the dictionary.

Full Definition of CINEMATIC
1: of, relating to, suggestive of, or suitable for motion pictures or the filming of motion pictures <cinematic principles and techniques>
http://www.merriam-webster.com/dictionary/cinematic

Since a vast majority of films are done in 24 fps, I would say that 60 fps is not very cinematic.

Not that it really matters. 60 fps does not have to be better at everything than 30 fps. 60 fps is used to make a game look smooth and silky, and I would say that it does its job very well.
I saw the Hobbit at 48fps. In the cinema. Boom, cinematic.

Frankly, I don't really give a shit. But I think the only people that are genuinely using the "cinematic" argument are the ones defending a game for being limited, for whatever reason, to 30fps.
 

Jtrizzy

Member
FYI, higher framerate does not equate to stutter-free. 30fps locked would be "judder-free" while 60fps unlocked wouldn't be.

Games are rarely "locked", unless we're talking about Nintendo. A dip below 30 is terrible, a dip from 60-50 or 55 is far less jarring.
 
Why do I want a game to be 'cinematic' in the first place, whatever that means? It implies a lack of agency, while high frame rates improve responsiveness. All games are better at 60 frames or more.
 

Squire

Banned
I wish people would stop holding on to low framerates being cinematic in general. 24fps shouldn't be acceptable for.... anything.

You people are the reason no theaters around me were showing the Hobbit in 48fps! Embrace the future, you luddites!

The real reason no theaters around you were showing that version is more likely because it was a stupid idea to begin with.

Film needs to be in 24fps.
 

mechphree

Member
Games are rarely "locked", unless we're talking about Nintendo. A dip below 30 is terrible, a dip from 60-50 or 55 is far less jarring.

Exactly. A dip of 5 or even 10 frames at 60 is nothing; you probably couldn't even tell the difference. Those dips, though, get magnified in 30-and-under land. A 10fps drop from 30 to 20 is definitely jarring to the experience. Most ND games don't even have locked 30 frame rates. I would rather have 60 unlocked than 30, most definitely.
 

Ivan

Member
Putting responsiveness aside, every frame rate has some form of aesthetic that comes with it. 30fps does look more "cinematic", because it "looks like a movie" in its speed and juddering, if you will. Today's movies, that is; no need for complex theories about the future. It doesn't mean that it gives better and more responsive gameplay, but it looks different.

Games don't have the natural motion blur we get from cameras while filming, so a 60fps game should have high-quality motion blur; it would make games at that frame rate less "sterile" and more "cinematic" aesthetically, but the developer must know how to do it properly. 30fps games should have that, too. It would be even more cinematic.

That's the main difference between frames from movies and frames from games - motion blur. In games we have "perfect" frames without motion blur, while if you press pause on an individual movie frame, you'll see a lot of motion blur. It makes everything smoother to our eyes in motion, so we need that in 30fps gaming too.

60 fps without motion blur is cool, but it is incomplete for certain games IMO (aesthetically)
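The pause-a-movie-frame point above can be put in rough numbers. A back-of-envelope sketch; the object speed and screen width are made-up illustrative values, not taken from any real camera or engine:

```python
# Back-of-envelope blur estimate. At 24fps with a classic 180-degree shutter,
# each frame is exposed for half the frame interval (1/48s), so a moving
# object smears across the frame; a game frame is an instantaneous sample
# unless the renderer adds motion blur explicitly.

def blur_pixels(speed_px_per_s, exposure_s):
    """Distance an object travels across the screen while the shutter is open."""
    return speed_px_per_s * exposure_s

film_exposure = (1 / 24) * 0.5   # 180-degree shutter at 24fps -> 1/48s
speed = 960                      # px/s: crossing a 1920px screen in two seconds

print(blur_pixels(speed, film_exposure))  # ~20px of smear baked into each movie frame
print(blur_pixels(speed, 0.0))            # 0.0: the "perfect" blur-free game frame
```

Under these assumed numbers, every filmed frame carries about 20 pixels of built-in smear that a raw game frame lacks, which is exactly the difference described above.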
 

wildfire

Banned
I wish people would stop holding on to low framerates being cinematic in general. 24fps shouldn't be acceptable for.... anything.

You people are the reason no theaters around me were showing the Hobbit in 48fps! Embrace the future, you luddites!
There are a bunch of drawbacks to 48 FPS. It becomes a lot harder to hide the makeup and fakeness of set pieces. The tradeoff is that 48FPS makes CGI a lot better though.
 
Games are rarely "locked", unless we're talking about Nintendo. A dip below 30 is terrible, a dip from 60-50 or 55 is far less jarring.

But still, it's silly to say that 60fps unlocked is smooth as butter or something.

If a game stays locked at 30fps, it's going to look smoother because it will divide evenly into most display refresh rates (60Hz, 120Hz, etc). A game that constantly fluctuates between 40-60fps is going to look worse than a game locked at 30fps since there will be an extra frame or two that repeat because it won't refresh evenly.
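The even-division point can be sketched in a few lines. A simplified model, assuming a vsynced 60Hz display where each rendered frame stays on screen for a whole number of refreshes; real presentation pipelines have more stages:

```python
import math

def refresh_counts(frame_times_ms, refresh_ms=1000 / 60):
    """How many 60Hz refreshes each rendered frame occupies (simplified vsync)."""
    return [max(1, math.ceil(t / refresh_ms)) for t in frame_times_ms]

# Locked 30fps: every frame takes 33.3ms and lands on exactly 2 refreshes,
# giving a perfectly even cadence.
print(refresh_counts([1000 / 30] * 4))           # [2, 2, 2, 2]

# Fluctuating 40-60fps: frame times vary, so some frames hold for 1 refresh
# and others for 2. That uneven repeat pattern is the judder described above.
print(refresh_counts([16.0, 25.0, 20.0, 30.0]))  # [1, 2, 2, 2]
```

The locked-30 cadence never varies, while the nominally faster fluctuating case keeps changing how long each frame is held, which is what the eye picks up on.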
 

styl3s

Member
Games are not film, and our eyeballs enjoy more frames.
My eyeballs did not enjoy The Hobbit at 48 frames. I walked out 15 minutes into the movie, that's how much I hated it.

But yes, your eyeballs do enjoy more frames in video games, but 30 vs. 60 doesn't matter to me as much as it does to some GAF duders.
 

TnK

Member
IMO, the sooner this "cinematic experience" trend gets out of gaming, the better.

I really dislike the push to make games like movies.

60-120 FPS for everything all the way!
 
I'm going to be that annoying guy who quotes the dictionary.

Full Definition of CINEMATIC
1: of, relating to, suggestive of, or suitable for motion pictures or the filming of motion pictures <cinematic principles and techniques>
http://www.merriam-webster.com/dictionary/cinematic

Since a vast majority of films are done in 24 fps, I would say that 60 fps is not very cinematic.

Not that it really matters. 60 fps does not have to be better at everything than 30 fps. 60 fps is used to make a game look smooth and silky, and I would say that it does its job very well.

"Cinematic" is more in reference to the pacing of action, cinematography, detail, animations, etc. Anyone who thinks this in regard to games has been tricked into believing framerate contributes to a cinematic feel.
 

wildfire

Banned
FYI, higher framerate does not equate to stutter-free. 30fps locked would be "judder-free" while 60fps unlocked wouldn't be.


Actually, frame locking doesn't necessarily mean the frame rate is fixed at 30. It should, but devs and publishers have abused that definition. There are drops now.

The only stutter-free games with 3D graphics are made by Nintendo, and at 60 FPS.

What does Pixar render in?
Until The Hobbit everyone did 24 FPS.
 

DrNeroCF

Member
But IMO more fps = more lifelike and indicative of how we see real life.

So what you're literally saying is, more FPS is less cinematic?

I don't know, I guess it just depends on what your priority is. With online play and the unwashed masses not even knowing what Game Mode is, very few games nowadays are even made with responsiveness in mind.

I really love 60 games (despite hating the 'smooth motion soap opera effect' on live video), but I think a game running at 30 fps with a fantastic implementation of per object motion blur would look far more stunning.

Frame judder is way worse than a solid 30 fps though, IMHO.
 

dallow_bg

nods at old men
People associate 24-30 FPS with "cinematic" because of the limitations of film. It's the same reason people balk at higher frame rates for movies and TV, and decry them as "soap opera-like." There's nothing inherently better about lower frame rates. There's nothing inherently worse about higher frame rates. In games, film, or TV. We've simply become conditioned to believe that lower frame rates equal higher-quality media (more so with film than anything else). It'll take a while for that bias to dissipate, but I think it will over time.

Personally, I really like even interpolated higher frame rates for visual media. The Walking Dead at 240Hz is like watching a documentary about zombies, or even more like being directly involved with the action. It's kind of awesome. I get why there's resistance, but the reason for the resistance is counter-intuitive. It's kind of like "We can't do it that way, because we've always done it this way," which I consider the worst argument one can possibly make.

Yuck. Interpolated TV and movies look terrible and do give that unrealistic fast-motion look.
Not representative of what actual higher-frame-rate media looks like.
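For what it's worth, the simplest form of interpolation really does just ghost the image. A toy sketch treating frames as flat lists of pixel intensities; real TVs use motion-compensated interpolation rather than plain blending, and its estimation errors are where the artificial look comes from:

```python
def interpolate(frame_a, frame_b, t=0.5):
    """Blend two frames at position t in [0, 1] - naive, motion-unaware."""
    return [(1 - t) * a + t * b for a, b in zip(frame_a, frame_b)]

frame1 = [0, 0, 255, 0]  # a bright pixel at index 2...
frame2 = [0, 0, 0, 255]  # ...that has moved to index 3 one frame later

print(interpolate(frame1, frame2))  # [0.0, 0.0, 127.5, 127.5]: a ghosted double
                                    # image instead of a pixel halfway in motion
```

Naive blending produces a double exposure rather than motion, which is why TVs attempt motion estimation instead - and why neither looks like native high-frame-rate footage.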
 

Jtrizzy

Member
It's not untrue. It's simply different. 60fps gives you better fluidity and response time with a major hit to how much RAM you can access. 60fps is different from 30, not just automatically better. It's all about the pros and cons and what your personal preference is.

Also, why aren't movies filming in 60 instead of 48? Seems counterproductive.

All things being equal (and we are talking about a last-gen port here), anyone who says they'd rather have 30 is out of their mind.
 

mechphree

Member
So what you're literally saying is, more FPS is less cinematic?

I don't know, I guess it just depends on what your priority is. With online play and the unwashed masses not even knowing what Game Mode is, very few games nowadays are even made with responsiveness in mind.

I really love 60 games (despite hating the 'smooth motion soap opera effect' on live video), but I think a game running at 30 fps with a fantastic implementation of per object motion blur would look far more stunning.

Frame judder is way worse than a solid 30 fps though, IMHO.

More lifelike for me means more immersive and more cinematic. I feel drops of 10 or so frames at 60 aren't very jarring to the experience, but those same drops at 30 and lower are really bad. I love TLOU, but that doesn't excuse the fact it sometimes runs like shit. I feel 60fps would be more lifelike to me.
 

Loofy

Member
Frame judder is way worse than a solid 30 fps though, IMHO.
Now here's an opinion that I find odd.
When did everyone get so sensitive to frame judder?
I find it hard to believe that in all these years PC gamers have been locking their games to 30fps instead of playing with the usual average of 40-55fps or something.
 
I don't understand how adding visual information is less cinematic. 24fps isn't even that good in movies, especially action ones. Of course this is being said generally, but 24/30fps does zero for me. There is nothing impressive about it, and it stifles good animation.
 

EatChildren

Currently polling second in Australia's federal election (first in the Gold Coast), this feral may one day be your Bogan King.
Watching != interacting. Passively observing a linear visual sequence has you perceive framerate pretty differently to interacting with a non-linear visual sequence. The former requires little investment, and allows your brain to relax and comfortably find patterns in "24 frames per second" data to form a cohesive visual sequence. The issue interactive mediums introduce is the fact that we're no longer casual observers; we have a natural expectation of patterns not just in visuals but in visuals as feedback for interactivity. We don't "run" at a low framerate, or any framerate, so our interactivity marked with low-framerate feedback can lead to a weird dissonance. In almost all scenarios your brain can and will appreciate more frequently updated visual feedback to your physical input, as that is what we're used to simply via our own existence.

An easy experiment is to play a game at 60fps and 30fps while recording footage, then go back and watch that footage. In the former scenario the footage will almost always seem oddly smooth and fast, more so than you remember while playing. Playing the latter may lead to frustration as you try to line up shots and move your character, yet the footage will seem perfectly cohesive and smooth.

Watching is a passive experience: your brain only needs to find patterns in the visual data and that's it. If it works, it works, and everybody is happy. Interaction is different: your brain is not passively finding visual patterns, but attempting to correlate those patterns to your own deliberate physical input, which is not bound by "framerates".

An argument that gameplay should be 60+fps and cutscenes 30fps is another matter entirely.
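The responsiveness half of this argument is easy to put in numbers. A deliberately simplified model, assuming an input can wait up to one full frame before being sampled and then one more frame to render; real pipelines add OS and display stages on top:

```python
def worst_case_latency_ms(fps):
    """Rough input-to-display delay: one frame of sampling wait + one of rendering."""
    frame_ms = 1000 / fps
    return 2 * frame_ms

print(round(worst_case_latency_ms(30), 1))  # 66.7ms at 30fps
print(round(worst_case_latency_ms(60), 1))  # 33.3ms at 60fps
```

Halving the frame time halves that feedback delay, which is the dissonance the post attributes to interacting rather than watching.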
 

ampere

Member
TLoU was a great game, but the fps was awful. It felt like I was playing a PC game on settings it couldn't handle. Big reason why I chose PC for games when possible.

I'd actually consider double dipping if the re-release is 60 fps, or close to it.
 

mechphree

Member
Watching != interacting. Passively observing a linear visual sequence has you perceive framerate pretty differently to interacting with a non-linear visual sequence. The former requires little investment, and allows your brain to relax and comfortably find patterns in "24 frames per second" data to form a cohesive visual sequence. The issue interactive mediums introduce is the fact that we're no longer casual observers; we have a natural expectation of patterns not just in visuals but in visuals as feedback for interactivity. We don't "run" at a low framerate, or any framerate, so our interactivity marked with low-framerate feedback can lead to a weird dissonance. In almost all scenarios your brain can and will appreciate more frequently updated visual feedback to your physical input, as that is what we're used to simply via our own existence.

An easy experiment is to play a game at 60fps and 30fps while recording footage, then go back and watch that footage. In the former scenario the footage will almost always seem oddly smooth and fast, more so than you remember while playing. Playing the latter may lead to frustration as you try to line up shots and move your character, yet the footage will seem perfectly cohesive and smooth.

Watching is a passive experience: your brain only needs to find patterns in the visual data and that's it. If it works, it works, and everybody is happy. Interaction is different: your brain is not passively finding visual patterns, but attempting to correlate those patterns to your own deliberate physical input, which is not bound by "framerates".

An argument that gameplay should be 60+fps and cutscenes 30fps is another matter entirely.

I agree. 30 is fine for watching, but 60 IMO is better for interacting with your media.
 
Watching != interacting. Passively observing a linear visual sequence has you perceive framerate pretty differently to interacting with a non-linear visual sequence. The former requires little investment, and allows your brain to relax and comfortably find patterns in "24 frames per second" data to form a cohesive visual sequence. The issue interactive mediums introduce is the fact that we're no longer casual observers; we have a natural expectation of patterns not just in visuals but in visuals as feedback for interactivity. We don't "run" at a low framerate, or any framerate, so our interactivity marked with low-framerate feedback can lead to a weird dissonance. In almost all scenarios your brain can and will appreciate more frequently updated visual feedback to your physical input, as that is what we're used to simply via our own existence.

An easy experiment is to play a game at 60fps and 30fps while recording footage, then go back and watch that footage. In the former scenario the footage will almost always seem oddly smooth and fast, more so than you remember while playing. Playing the latter may lead to frustration as you try to line up shots and move your character, yet the footage will seem perfectly cohesive and smooth.

Watching is a passive experience: your brain only needs to find patterns in the visual data and that's it. If it works, it works, and everybody is happy. Interaction is different: your brain is not passively finding visual patterns, but attempting to correlate those patterns to your own deliberate physical input, which is not bound by "framerates".

An argument that gameplay should be 60+fps and cutscenes 30fps is another matter entirely.
Perfectly put.
 

RagnarokX

Member
This reminds me of a couple of weekends ago when I helped my half-brother move out of his dad's house. His dad got a new HDTV, since my brother's family was taking their TV with them, and its motion-smoothing features made regular TV look smoother than 24fps. My brother complained that it looked "cheap" and fiddled with the remote until he found Game Mode, which he left on.
 

ChawlieTheFair

pip pip cheerio you slags!
Our TVs have been 60Hz for an eternity, so 60fps is ideal for display reasons.

That said, I'm a huge supporter of locked 30FPS to enjoy all the bells and whistles that double the frame time can allow.

As long as you can hit a consistent 30, 60, or 120 I'm all for it.

This. You can't miss a higher framerate in a game if you never experienced it in the first place. People always claim a certain framerate is needed for whatever genre of game, like shooters or racers. Fact is, it usually isn't, and if you haven't seen 60 for a particular game, more than likely 30 is totally fine. If for whatever reason you can get 60 and then switch back to 30, say in a PC game, then I see how one would desire it. Regardless, it's a visual thing for MOST people; for some it's controls. There's no arguing that 60fps both controls and looks better in terms of fluidity, but like you I honestly don't care as long as it's constant. While every game having 60 would be amazing, it's simply illogical to expect that, as every developer/publisher wants different things out of their games, i.e. more graphical prowess or more fluid gameplay.
 

beril

Member
Are people still arguing about the Hobbit? The framerate was the only redeeming factor in those movies. 24fps is terrible, although much less important in movies. Higher framerate is always better in any medium, though sometimes it may not be worth the trade-off.
 
This reminds me of a couple of weekends ago when I helped my half-brother move out of his dad's house. His dad got a new HDTV, since my brother's family was taking their TV with them, and its motion-smoothing features made regular TV look smoother than 24fps. My brother complained that it looked "cheap" and fiddled with the remote until he found Game Mode, which he left on.

Good, frame interpolation (and any of those effects that modify the signal on screen) should be avoided at all costs.
 

Ivan

Member
24 fps is cinematic. Games just shouldn't be cinematic.

Nice, but that's another subject. We are talking about how something looks here, not how it feels to play.

We are used to 24fps for certain types of content. It's debatable whether that's right or wrong, but I don't think we can change it easily.
 

ChawlieTheFair

pip pip cheerio you slags!
Are people still arguing about the Hobbit? The framerate was the only redeeming factor in those movies. 24fps is terrible, although much less important in movies. Higher framerate is always better in any medium, though sometimes it may not be worth the trade-off.

Maybe. I was personally reminded of TV soap operas when watching the Hobbit, and aside from a few action scenes, I thought the higher framerate degraded the experience. Could just be because we are all so used to 24fps for movies.
 

Kuro

Member
The Hobbit in 48 is pretty jarring. You really have to get used to it; otherwise it looks like a video game with real people on top of it.

A high frame rate across everything doesn't make all formats better. Some work and some don't.

That is more a case of people being so used to 24fps films that 48fps makes things look more "real". Once people get used to it, the benefits will be great, like better CGI integration in live action and easier-to-follow movement during high-action scenes.
 