
John Carmack: "Many next-gen games will still target 30 fps"

Then be prepared to be stuck in the 7th generation if you're a console-only player

I mostly play Nintendo + PC games, and Nintendo's console games tend to run at 60 fps (though sadly their 3DS games have mostly been 30), so I should be good on that front.
 
Played BioShock over the weekend.

The 30 vs 60 FPS option was painfully obvious, not sure how anyone can't see the difference.

Wow, does the frame rate change that much? I'm playing that again right now too. I flipped the option on and off and could barely tell the difference in the graphics, so I just kept it on the smooth performance setting.
 
A bit of a side note, but for those with AMD cards looking for framerate-consistency options: the latest preview build of RadeonPro includes Triple Buffered Vsync, Dynamic Vsync, Double Vsync (half monitor refresh rate), and Dynamic Framerate Control (for locking the framerate). The app is getting a huge overhaul with tons of new features.
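For the curious, the locking part is conceptually just a sleep-until-deadline loop. Here's a minimal sketch of the idea (RadeonPro itself hooks the game's present call rather than owning the main loop, so everything below is illustrative, not its actual implementation):

```cpp
#include <chrono>
#include <thread>

// Sketch of a 60 fps framerate lock using a sleep-until-deadline loop.
// A real limiter like RadeonPro intercepts the game's present call;
// this just shows the pacing idea.
int main() {
    using clock = std::chrono::steady_clock;
    const auto frame_budget = std::chrono::microseconds(1000000 / 60);
    auto next_deadline = clock::now() + frame_budget;

    for (int frame = 0; frame < 600; ++frame) {
        // ... simulate and render the frame here ...

        // Finished early? Sleep off the remainder so frames come out at
        // an even cadence instead of as fast as the GPU allows.
        std::this_thread::sleep_until(next_deadline);
        next_deadline += frame_budget;
    }
}
```

Sleeping to a running deadline, rather than a fixed amount per frame, is what keeps the cadence even when individual frame times vary.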
 
Fairly off-topic, but I checked out the NFS:MW demo and I'm pretty appalled that Criterion and EA released that game at 30 FPS; it looks and feels horrible. I hope publishers don't try to move even fast-paced games to 30 FPS in the future.
 
I think a reasonable focus for devs going forward should be maintaining a stable, consistent frame rate. It doesn't matter to me whether it's 30 fps, 60 fps, or somewhere in between. There's nothing more aggravating to me than trying to do something that requires reasonable precision from the player while the frame rate fluctuates between the 10s and the 50s.
 
Call of duty?

COD looks pretty good for 50-60fps on consoles. The resolution sucks, but people who don't know any better don't notice it. There are some nice artistic workarounds that make the games look good, though; better than a lot of 30fps games, even.
 
30 fps always looks like shit. No game running at 30 fps can truly be considered "next gen".
Movies are (generally) 24fps, including all the CG ones, e.g. Pixar's. No game looks as good as a Pixar film, so clearly there is much more to a generational leap than framerate: motion blur quality, edge quality, lighting precision, detail levels, etc.

I think if a studio were to make a game that looked like a Pixar film and ran at 30fps, nobody would be disputing that it was 'next gen'.
 
John Carmack delivered the (shocking?) news via Twitter:



Those of you who crave 60 fps, PC gaming's got you covered.

Except most people without a $300 graphics card can't touch 60 FPS on high settings at 1080p in quite a few games. Don't pretend 60fps is a magic number PCs can always hit.

Sony and Microsoft should make 60 fps mandatory.


Look at the differences between Far Cry 3 on PS3 and PC:

(direct feed, PS3)


To the average Joe, the differences - if he even sees any - are definitely not worth a new $500 console.

But one runs at 20 fps, the other at 60 fps.

I really hate when people do this, because you aren't comparing apples to apples. Quite a few people own PCs that can't even run Far Cry 3 at 20fps, let alone High + AA + 1080p @ 60. My 3570K OC'd to 4.2GHz, 8GB of RAM, and a 7950 can't run 60fps on high with any AA on. That's an $800+ build. It destroys what my 360 does, but it damn well better for that much money.
 
Pretty much this. For me (and I suppose for a large majority) 30fps was never an issue or detrimental to the gameplay.

Play Street Fighter IV at 30fps, come back and try telling me the same. It is enlightening.

THAT would be fucking awesome.

I really wish this were an option on the PC as well, to be honest. I love the idea of only dropping resolution when a scene becomes too demanding.

There are loads of PC games that run perfectly most of the time but manage to drop frames in specific situations where this could be a benefit. It would be especially nice when coupled with more aggressive AA: let us use higher-quality AA until performance becomes an issue, then drop it temporarily.

With Nvidia implementing the "Adaptive Vsync" option, I wonder if they could do an "Adaptive Resolution" thing via the Control Panel. Maybe that's something that has to be implemented on a per-game basis, though.
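The control loop such an adaptive-resolution feature would need is conceptually simple. Here's a rough sketch, with the function name, thresholds, and bounds all invented for illustration (not any real driver or engine API):

```cpp
#include <algorithm>
#include <cstdio>

// Invented sketch of a dynamic-resolution control loop: shed render
// scale quickly when a frame blows its budget, recover slowly when
// there's headroom. Names and thresholds are illustrative assumptions.
float UpdateRenderScale(float scale, float frame_ms, float budget_ms) {
    if (frame_ms > budget_ms * 1.05f)
        scale -= 0.05f;                    // over budget: drop pixels fast
    else if (frame_ms < budget_ms * 0.85f)
        scale += 0.01f;                    // headroom: creep back toward native
    return std::clamp(scale, 0.5f, 1.0f); // stay between half res and native
}

int main() {
    float scale = 1.0f;
    // Simulate a heavy scene: 20 ms frames against a 16.7 ms (60 fps) budget.
    for (int i = 0; i < 8; ++i) {
        scale = UpdateRenderScale(scale, 20.0f, 16.7f);
        std::printf("frame %d: render scale %.2f\n", i, scale);
    }
}
```

Dropping fast and recovering slowly is the sensible bias, since a visible hitch hurts more than a few frames at reduced resolution.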
 
Play Street Fighter IV at 30fps, come back and try telling me the same. It is enlightening.



With Nvidia implementing the "Adaptive Vsync" option, I wonder if they could do an "Adaptive Resolution" thing via the Control Panel. Maybe that's something that has to be implemented on a per-game basis, though.

There are some games whose gameplay undoubtedly benefits from a higher framerate, to the point that a lower fps becomes detrimental. But this doesn't hold true for all games.
 
There are some games whose gameplay undoubtedly benefits from a higher framerate, to the point that a lower fps becomes detrimental. But this doesn't hold true for all games.

You said "never", though... OK that was just nitpicking your post. =P

I'd like it if all games were 60fps though, even if gameplay-wise it doesn't matter. I'm a crack addict for smooth motion; I was even planning a trip to another state just to watch The Hobbit at 48fps (which fortunately didn't happen, because they ended up showing it in my city).
 
That PC screenshot (I can't see the console one) looks like a cartoony-style game. I would never have thought this was as serious a game as Far Cry if it weren't mentioned by name. Do people really prefer these bright colors for gritty war-themed games?

The first Far Cry was a violent game in a colorful island setting, and the second was browner but still featured a lot of nice scenery porn. Far Cry 3 is a return to form, and it's not even "war themed" either; there's no real war going on, just a bunch of disorganized jerks with guns fighting over this or that.
 
30fps, long loading times. All to get a nice screenshot on IGN, while the game keeps being a shit experience.

You can target 60fps and look good for pictures.

Last gen did this well.

(screenshot: Ninja Gaiden)
 
Next gen, or same-ish gen?

Seriously, this whole "next gen is slightly more powerful than last gen" bullshit is getting really old. You'd think they could do a lot more in an 8-year hardware cycle.

Hell, you could get 60fps @ 1280x960 on 2005 hardware in games like F.E.A.R. with 4x AA on the PC. The hardware at the time was top-end, but that hardware today would be pocket change.

It's amazing how far behind console tech is. I realize that they have to have a standard so dev kits can know what to expect, etc., but sheesh, when was this bar set? 2006?
 
That's fine with me quite frankly. I would rather the non-console exclusives target a lower framerate and higher resolution anyway.
 
You can target 60fps and look good for pictures.

Last gen did this well.

(screenshot: Ninja Gaiden)

Heh, you know, that is interesting now that I come to think of it: NG/B ran at 60 fps while still being one of the best-looking games of that generation. Funny how that works.
 
Most people don't even know what they like about COD. They know it feels better and they prefer it, but they don't identify it as a better framerate. COD never advertises that as a selling point.

I would love to see the reaction to a 30fps Call of Duty from the masses. Would they notice? Would they stop playing?
 
Heh, you know, that is interesting now that I come to think of it: NG/B ran at 60 fps while still being one of the best-looking games of that generation. Funny how that works.
That was common last generation, actually. The first and best example for me has to be Metal Gear Solid 2.

60 fps was extremely common last generation for AAA and budget titles alike. Even genres in which it doesn't seem necessary often delivered 60 fps, and most of those games were Japanese-made.

The typical Western Xbox game definitely didn't aim for 60.
 
I think this generation struggled with the upped resolution.

1080p is going to be more in tune with next-gen hardware than 720p was this gen. It's going to be much more balanced.
 
All I care about is that most of them are native 1080p. The resolution gap has always been the biggest advantage for PC gaming and that'll be gone if we get 1080p on consoles.
 
Last gen was better than this one, IMHO.

This is the second-worst generation after the 32-bit one.

I think next gen will be much better.

Yup, the PS2/GCN/Xbox generation of consoles generally supported 480p and could get some pretty comfortable framerates at that resolution. The jump to 720p and 1080p is what caused a lot of issues for the current generation of consoles (outside of the Wii). There are a lot of PS3 and 360 games out there that have trouble maintaining a good framerate at 1280×720 and run at resolutions below that.
 
You say that as if they were alone. Most top Japanese developers took this approach; Team Ninja were definitely among the best, however.

I would say KojiPro too, except they pushed the PS2 too hard with MGS3.

Capcom made a ton of great-looking 60fps titles on PS2, like Team Ninja did.

We did have a few 60fps Western titles on the Xbox like this one:

(screenshot)


Dead to Rights.
 
All I care about is that most of them are native 1080p. The resolution gap has always been the biggest advantage for PC gaming and that'll be gone if we get 1080p on consoles.
Yeah, but to be fair, this generation was far and away the best in that regard. Resolutions close to 1280x720 are still common in the laptop space, for instance, and are perfectly suitable for decent UIs and smaller text.

Also, on the right display from a moderate viewing distance, 720p can still look very clean unlike 640x480 or lower (on the same type of display).
 
Movies are (generally) 24fps, including all the CG ones, e.g. Pixar's. No game looks as good as a Pixar film, so clearly there is much more to a generational leap than framerate: motion blur quality, edge quality, lighting precision, detail levels, etc.

I think if a studio were to make a game that looked like a Pixar film and ran at 30fps, nobody would be disputing that it was 'next gen'.

And I think that if a studio released a Pixar-quality film at 60 fps, nobody would say it looked as good as it would at 24 or 30 fps.

And that's ignoring that controller response is always going to be affected by 30 fps vs 60 fps. Even if you poll the controller 120 times a second, the image on screen will still have taken twice as long to get there, and you can take that difference in time and add it onto your reaction speed (and your display lag, etc).

60 fps provides optimal controller response, which isn't something Pixar has to worry about when making 24 fps films. Wanting games to control as well as possible is a pretty sensible desire.
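To put rough numbers on that (simple arithmetic, not measured input-lag figures):

```cpp
#include <cstdio>

// Back-of-the-envelope frame times for the response argument above:
// input can't show up on screen faster than it takes to produce a frame.
int main() {
    const double rates[] = {30.0, 60.0};
    for (double fps : rates) {
        std::printf("%2.0f fps: one frame = %.1f ms\n", fps, 1000.0 / fps);
    }
    // 33.3 ms at 30 fps vs 16.7 ms at 60 fps: roughly 17 ms of extra
    // baseline delay before display lag and reaction time stack on top.
}
```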
 
I really hope more devs use a dynamic framebuffer.

It's what saved Doom 3 on consoles and RAGE, letting them keep a high framerate.

If Crytek gave a damn, Crysis 2 on consoles could have performed with a much more stable framerate.
 
Yeah, but to be fair, this generation was far and away the best in that regard. Resolutions close to 1280x720 are still common in the laptop space, for instance, and are perfectly suitable for decent UIs and smaller text.

Also, on the right display from a moderate viewing distance, 720p can still look very clean unlike 640x480 or lower (on the same type of display).

It was better this gen for sure, but it's still a huge deal unless you sit 10 feet away from your monitor/TV. I guess because I game up close on PC exclusively, it's more apparent. 720p is brutal for me.
 
Actually, the real concern is: will developers make sure that they can at least maintain a stable 30fps? Honestly, a smooth frame rate is more important than 60fps. I am sick of playing slideshows.
 
It was better this gen for sure, but it's still a huge deal unless you sit 10 feet away from your monitor/TV. I guess because I game up close on PC exclusively, it's more apparent. 720p is brutal for me.
I think it really does depend on the display more than anything else.

720p looks absolutely abhorrent on my 2560x1440 LCD. I mean, it is just plain awful-looking and unplayable for me. On my Kuro plasma in the basement, though? 720p actually looks surprisingly crisp and clean. Jumping to 1080p definitely results in a superior image, but the difference is much less significant, to the point that I can stomach lowering PC games as low as 720p if it means a stable framerate. On the LCD, I just can't deal with it.

If you're used to sitting at a desk with a monitor in your face, then 720p isn't enough, especially if said monitor is an LCD of some sort. 1280x720 on a good CRT monitor can look more attractive than double the resolution on an LCD in many ways.
 
Actually, the real concern is: will developers make sure that they can at least maintain a stable 30fps? Honestly, a smooth frame rate is more important than 60fps. I am sick of playing slideshows.

It's a result of this gen going on too long. A lot of devs are tired of making games tailored to this gen's consoles, so they make a game powerful enough for the PC to display and port it down.
 
If it's a choice between 30fps with extra eye candy or 60fps but pared back, I'll take 30fps every time.
 
From an average viewing distance of 10 feet, do we really need 1080p? What about developers doing 900p, sort of like how we got a bunch of 600p games this gen?
 