
John Carmack: "Many next-gen games will still target 30 fps"

I'm really pulling for HFR in movies to do well financially AND creatively. 24 fps and 30 fps are just things we're used to; that's the only reason we put up with them. The Madden example is a good one. When a series that had been reliably 60 fps doesn't manage it, people aren't happy.

When COD BLOPS wasn't as smooth on PS3 as it was on 360 (and in previous games) people weren't happy.

Playability is no minor thing to sacrifice for shinier rocks, or whatever. Not every game benefits a lot from 60 fps, but every game benefits somewhat from it. It never makes a game look or play worse.

Is the tradeoff always worth it? Dunno, but generally I'd take the 60 fps personally. If HFR becomes a thing, fewer and fewer people are going to put up with 30 fps.
 
Shocking news. Why do people expect this with every new console cycle? Devs use extra power to push extra graphics, not frame rate.

And there will probably be sub HD resolutions as well, as we get further into the gen. I'm honestly not that interested in next gen yet since my PC can already run Crysis 2 on Ultra @ 60FPS.
 
Playability is no minor thing to sacrifice for shinier rocks, or whatever. Not every game benefits a lot from 60 fps, but every game benefits somewhat from it. It never makes a game look or play worse.

I'm pretty sure you'll find some on GAF who say otherwise.
 
And there will probably be sub HD resolutions as well, as we get further into the gen. I'm honestly not that interested in next gen yet since my PC can already run Crysis 2 on Ultra @ 60FPS.

Sub 1080p, yes, but I'd be shocked if a game went below 720p in the life of the next Xbox. Most games on 360 are native 720p, so I don't see next-gen games dipping below that.
 
Sub 1080p, yes, but I'd be shocked if a game went below 720p in the life of the next Xbox. Most games on 360 are native 720p, so I don't see next-gen games dipping below that.

I do. Mainly because I don't see games that are sub 720p failing to sell. It'll be rarer, but it's going to keep happening.
 
Shocking news. Why do people expect this with every new console cycle? Devs use extra power to push extra graphics, not frame rate.

Poor Ratchet and Clank: it pushed frame rate and didn't get enough props for it. The masses want shiny graphics, not 60 frames.
 
You're right, but again only because they've been wired to think 24 fps is 'cinematic'. That's already started being chipped away by The Hobbit... give it time.

I still find it funny how a game like Uncharted is considered "cinematic" when it runs at 30fps.

With that framerate it's about as "cinematic" as CSI: Miami.
 
WiiU included, but I mean as a whole, it seems that the gap in power between console generations is shrinking or stagnating. Mainly this is in regard to the 30/60fps argument going on here. I'm just disappointed that we're still talking about 30 vs 60fps for "next gen" consoles. To me it should have been a natural progression to higher frame rates and higher visual fidelity. It seems flat out lazy that we're sacrificing one for the other at this point, since when you move on to the next generation you should be making leaps, not baby steps. I guess it's just something that console-only folks don't care that much about, which is why they're not people.
Again, going from 60 to 30 FPS means you can more than double the amount of graphical niceties you can put onscreen at once.

There are always going to be MAJOR compromises made to achieve constant 60 FPS, and the developers just don't agree that they're worth it.
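The "more than double" part is easy to sanity-check: the budget is per-frame milliseconds, and fixed per-frame costs don't shrink when the target drops. A quick sketch in Python, numbers purely illustrative:

MS_PER_SECOND = 1000.0

for fps in (30, 60):
    print(f"{fps} fps -> {MS_PER_SECOND / fps:.2f} ms per frame")
# 30 fps -> 33.33 ms per frame
# 60 fps -> 16.67 ms per frame

# Why "more than double": fixed per-frame work (game logic, physics) doesn't
# shrink when you drop the target. With a hypothetical 5 ms of fixed cost,
# the rendering slice grows from ~11.67 ms to ~28.33 ms, i.e. ~2.4x.
FIXED_MS = 5.0
print((MS_PER_SECOND / 30 - FIXED_MS) / (MS_PER_SECOND / 60 - FIXED_MS))  # ~2.43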
 
Bull fucking shit. I don't believe you. Even the 8800GT that I bought two years later couldn't do that completely smoothly, let alone the 7000 or 6000 series cards.

I just pulled that from Tom's; it's an X1900 XTX, which is what I was using in 2005

http://www.tomshardware.com/charts/desktop-vga-charts-2005/F.E.A.R.,646.html

And your 8800GT better damn well have done that or at least been close, unless there was something horribly wrong with your rig. Though it may have been just under 60 for a GT; I had an Ultra at the time, but I honestly can't remember what it ran at. FEAR was way too demanding for what it looked like though, you are right.

I said it was high end hardware.
 
Again, going from 60 to 30 FPS means you can more than double the amount of graphical niceties you can put onscreen at once.

There are always going to be MAJOR compromises made to achieve constant 60 FPS, and the developers just don't agree that they're worth it.

There are major compromises made by going to 30 too.

And people are going to start wanting more FPS if HFR starts gaining traction. And not having 'double the amount of graphical niceties' hasn't stopped people from praising COD's graphics, nor has it stopped the game from flying off the shelves.

I agree with you that 30 fps isn't going away, but more people are going to turn on it ;)
 
Again, going from 60 to 30 FPS means you can more than double the amount of graphical niceties you can put onscreen at once.

There are always going to be MAJOR compromises made to achieve constant 60 FPS, and the developers just don't agree that they're worth it.

Remember Last Gen:

30fps

[MGS3 screenshot]


60fps

[Ninja Gaiden screenshot]


Double the framerate. Double the graphics fidelity.
 
Has nothing to do with the leap in power with the next Xbox/PS. Devs will focus on 30fps, so they can push more in terms of overall graphics.

That does have to do with the leap of power. If they aren't able to focus on both fps and increasing "overall graphics" as you put it, then there obviously isn't enough power. But you could have the power of a top-of-the-line PC in console form right now, and they'd find a way to make a game look incredible but still run at 30fps on a console. Not because it can't run at 60, but because fidelity is their only priority. They don't care about framerate because framerate doesn't matter on the back of the box or the YouTubes.



And what the hell is this garbage?

It was a joke? Sensitive much?
 
They super sampled on those systems?

Of course. Bullshots.

How MGS3 really looks:

[MGS3 screenshot]


How Ninja Gaiden really looks:

[Ninja Gaiden screenshots]


MGS3 on PS2 was sub 480i @ 30fps

Ninja Gaiden Black was 480p with 16:9 @ 60fps.

Even without factoring in bullshots... Ninja Gaiden still came out on top visually.
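For anyone wondering what a bullshot actually is mechanically: the capture comes from a render target far above the console's real output, then gets downscaled, which averages many rendered pixels into each published pixel, basically free supersampled AA. A minimal sketch, assuming Pillow is installed and with made-up file names:

from PIL import Image

# Capture taken from an inflated render target, way above real console output.
shot = Image.open("capture_2160p.png")

# Downscale to press-release size; Lanczos filtering averages many rendered
# pixels per output pixel, so jaggies and dithering melt away.
bullshot = shot.resize((1280, 720), resample=Image.LANCZOS)
bullshot.save("press_shot_720p.png")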
 
When ppl were flaming me for doubting a "10x power increase" for next gen in the PS3 rumor thread... they've gone real quiet real quick.

Guess arbitrary numbers (flops or whatever) don't actually mean anything when talking about the end result.
 
Expected but disappointing. Of course they will use the muscle to catch people's eyes rather than please the few who notice or know the difference between framerates.
 
Of course. Bullshots.

How MGS3 really looks:

[MGS3 screenshot]


How Ninja Gaiden really looks:

[Ninja Gaiden screenshot]



MGS3 on PS2 was sub 480i @ 30fps

Ninja Gaiden Black was 480p with 16:9 @ 60fps.

Even without factoring in bullshots... Ninja Gaiden still came out on top visually.

But the Xbox was a more powerful console than the PS2...
 
[snip]

Even without factoring in bullshots... Ninja Gaiden still came out on top visually.
You're seriously comparing games on two completely different systems as proof that devs can do better graphics with 60 FPS?

Besides the fact that it is a ludicrous comparison (no one's surprised at all that the Xbox is more powerful than the PS2), it's missing the point entirely - Ninja Gaiden could still have looked better running at 30 FPS rather than 60. That's why devs choose 30.
 
Of course. Bullshots.

How MGS3 really looks:

*snip*

How Ninja Gaiden really looks:

*snip*

MGS3 on PS2 was sub 480i @ 30fps

Ninja Gaiden Black was 480p with 16:9 @ 60fps.

Even without factoring in bullshots... Ninja Gaiden still came out on top visually.

You've just reminded me of how fucking terrible Gamespot's screenshot captures used to be. Neither game actually had gamma and JPEG compression artifacts that terrible ;)

MGS3 did not have image quality nearly that bad. Thank god we've mostly left those days behind. Not that it changes your point though.
 
MGS3 on PS2 was sub 480i @ 30fps
To be fair, MGS3 actually used a full framebuffer (as did MGS2) and could have output 480p with the right add-on. The game simply didn't offer a 480p option even though the hardware could have handled it.
 
People expecting under 720p for next gen are insane. Every generation has had a resolution bump similar to the jump from 720p to 1080p; last gen's was obviously bigger, so this increase won't be a huge burden. Sure, there are always games that run under the native/target resolution, but very rarely less than half of that, and certainly not in a well-regarded title.
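The raw pixel counts back that up, taking the standard native resolutions at face value:

resolutions = {"480p": (640, 480), "720p": (1280, 720), "1080p": (1920, 1080)}
pixels = {name: w * h for name, (w, h) in resolutions.items()}
print(pixels)  # {'480p': 307200, '720p': 921600, '1080p': 2073600}

print(pixels["720p"] / pixels["480p"])   # 3.0  -> last gen's jump
print(pixels["1080p"] / pixels["720p"])  # 2.25 -> the next one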
 
You've just reminded me of how fucking terrible Gamespot's screenshot captures used to be. Neither game actually had gamma and JPEG compression artifacts that terrible ;)

MGS3 did not have image quality nearly that bad. Thank god we've mostly left those days behind. Not that it changes your point though.

Actually, on an SD LCD... MGS3 looks worse than that.

To be fair, MGS3 actually used a full framebuffer (as did MGS2) and could have output 480p with the right add-on. The game simply didn't offer a 480p option even though the hardware could have handled it.

I think Silent Hill 3 used a full framebuffer as well. Great game hurt by IQ.
 
COD was 60fps last gen though.

Some games shouldn't be 30fps. I remember the negative reaction to Madden going 30fps. Notice how that was quickly changed.

CoD was 60fps on the PS2, GC, and Xbox? IIRC the frame rate was kind of rough when I played CoD: Finest Hour.

WiiU included, but I mean as a whole, it seems that the gap in power between console generations is shrinking or stagnating. Mainly this is in regard to the 30/60fps argument going on here. I'm just disappointed that we're still talking about 30 vs 60fps for "next gen" consoles. To me it should have been a natural progression to higher frame rates and higher visual fidelity. It seems flat out lazy that we're sacrificing one for the other at this point, since when you move on to the next generation you should be making leaps, not baby steps. I guess it's just something that console-only folks don't care that much about, which is why they're not people.

There's a good chance we'll see a good leap to next gen. However, no matter the leap, there has always been a sacrifice between frame rate and fidelity. The jump from the PSone to PS2 was huge, but a game running at 30fps would still look better than the same game running at 60fps. It's the unfortunate reality of working within a closed box with a finite amount of power. Now, that doesn't mean we won't see a huge leap between 30fps games this gen and 30fps games next gen, between 60fps games last gen and 60fps games next gen, or between 30fps games last gen and 60fps games next gen. Assuming CoD still tries to hit 60fps next gen, it will look better than Halo 4 does this gen.

So actually you'll get what you want, higher fidelity at higher frame rates.

FEAR was just one game I could pull 2005 benchmarks for, running at 720p+ at 60fps with 4xAA, a standard that still isn't met by consoles today. My point was just that the bar on these next gen systems seems low as far as performance goes. I see your point though.

As dark10 already pointed out, I don't recall PCs easily running FEAR at 720p60 with 4xAA in 2005. Besides, the typical current gen shooter is doing a lot more than FEAR was in 2005, so it's unrealistic to expect current gen shooters to reach 720p60 with 4xAA.

I get your point about being limited by hardware, heat, cost, etc. I guess if you could get 1080p or 720p @ 60fps on a console there wouldn't be much reason for a PC. But the hardware gap has to be closing rather than widening, which is why I can't figure out why the framerate issue hasn't been addressed in 2+ generations.

It doesn't matter how large or small the gap is between consoles and PCs; the frame rate issue will always rear its ugly head on both platforms. It's just that on the PC you can remove these issues with more investment in faster hardware. On the console, there will always be a sacrifice of frame rate to create prettier pixels. We just have to hope developers don't go too low. No matter the amount of power given, even if it matched the highest PC specs today, we would still see 30fps games.
 
CoD was 60fps on the PS2, GC, and Xbox? IIRC the frame rate was kind of rough when I played CoD: Finest Hour.



There's a good chance we'll see a good leap to next gen. However, no matter the leap, there has always been a sacrifice between frame rate and fidelity. The jump from the PSone to PS2 was huge, but a game running at 30fps would still look better than the same game running at 60fps. It's the unfortunate reality of working within a closed box with a finite amount of power. Now, that doesn't mean we won't see a huge leap between 30fps games this gen and 30fps games next gen, between 60fps games last gen and 60fps games next gen, or between 30fps games last gen and 60fps games next gen. Assuming CoD still tries to hit 60fps next gen, it will look better than Halo 4 does this gen.

So actually you'll get what you want, higher fidelity at higher frame rates.



As dark10 already pointed out, I don't recall PCs easily running FEAR at 720p60 with 4xAA in 2005. Besides, the typical current gen shooter is doing a lot more than FEAR was in 2005, so it's unrealistic to expect current gen shooters to reach 720p60 with 4xAA.



It doesn't matter how large or small the gap is between consoles and PCs; the frame rate issue will always rear its ugly head on both platforms. It's just that on the PC you can remove these issues with more investment in faster hardware. On the console, there will always be a sacrifice of frame rate to create prettier pixels. We just have to hope developers don't go too low.
O_O wow, very low expectations. I mean, I'm not sure about 60 fps, but I expect 1080p will be like the 720p of the current generation, no?
 
Of course. Bullshots.

How MGS3 really looks:
[MGS3 screenshot]


How Ninja Gaiden really looks:
[Ninja Gaiden screenshot]


MGS3 on PS2 was sub 480i @ 30fps
Ninja Gaiden Black was 480p with 16:9 @ 60fps.

Even without factoring in bullshots... Ninja Gaiden still came out on top visually.

Xbox was a beast last gen though, compared to the PS2 anyway. Not a good comparison.
 
One concern I have for next gen is how recent console games have been really struggling on the framerate side. A decent bump in performance will be absorbed just getting current engines running at a reasonable framerate.
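To put a rough, hypothetical number on "absorbed": the speedup needed just to stabilize frame rate at the same settings eats into the bump before any visual upgrades happen.

def required_speedup(current_fps: float, target_fps: float) -> float:
    """Raw speedup needed to hit target_fps at unchanged settings."""
    return target_fps / current_fps

print(required_speedup(25, 30))  # 1.2x just to hold a steady 30
print(required_speedup(25, 60))  # 2.4x gone before anything looks better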
 
The PS2 had better fillrate? Some exclusives could run badly on the Xbox, imho.

For transparency-based games, yes. But on sheer power and the use of modern shaders the Xbox won easily.

PS2

Tekken 5

[Tekken 5 screenshot]


Soul Calibur 3

[Soul Calibur 3 screenshot]



Xbox Dead or Alive 3

[Dead or Alive 3 screenshots]


And that was a launch title.
 
For transparency-based games, yes. But on sheer power and the use of modern shaders the Xbox won easily.
I know, but transparencies are really relevant. They mean grass, vegetation, special effects, and I think even an edge on the polycount front; of course, the Xbox was more modern thanks to its shaders, but there were areas where the PS2 hardware beat the Xbox's strengths.
 
I remember playing Big Red One at 60fps on Xbox. I figured it was the same as the MOH titles on PS2.

Oh well, that one I missed. I hadn't owned a last gen console for months by the time BRO was released.

O_O wow, very low expectations. I mean, I'm not sure about 60 fps, but I expect 1080p will be like the 720p of the current generation, no?

I'm not sure how the bolded line you quoted leads to low expectations next gen. I was saying that shooters on the PS360 typically do more than FEAR, so it's unrealistic to expect games like KZ or Halo to reach 720p60 with 4xAA.

I'm hesitant to assume 1080p will be standard next gen, similar to how 720p has been the standard this gen. I agree that we'll see more 1080p games or games using dynamic resolutions, but if a studio sees their game can look and run better at 720p with good filtering, that's the resolution they'll likely go with. There is nothing wrong with 720p as long as we have good AA and filtering.

The PS2 had better fillrate, no? Some exclusives could run badly on the Xbox, imho.

IIRC one of the reasons for the higher fillrate in the PS2 was to support the multi-pass rendering employed on the system. By contrast, the Xbox was designed to handle more in a single pass to help work around its bandwidth issues. I could be off; last gen feels like a lifetime ago for me. =p

Exclusives could run badly if not ported correctly, as seen with MGS2 on the Xbox. Though I'm not sure what they could have done with that game. =p
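Back-of-the-envelope on that, using the commonly quoted peak figures (treat the numbers as illustrative; real-world throughput was lower and workload-dependent):

PS2_FILLRATE = 1.2e9    # textured pixels/s, Graphics Synthesizer peak (quoted spec)
XBOX_FILLRATE = 932e6   # pixels/s, NV2A peak (quoted spec)

def pixels_per_frame(fillrate: float, fps: int, passes: int) -> float:
    """Per-frame pixel budget once every pixel is drawn `passes` times."""
    return fillrate / fps / passes

# If the PS2 burns 4 passes per pixel while the Xbox gets the same effect
# in 2 thanks to doing more per pass, the on-paper gap flips:
print(pixels_per_frame(PS2_FILLRATE, 60, 4))   # ~5.0e6 pixels/frame
print(pixels_per_frame(XBOX_FILLRATE, 60, 2))  # ~7.8e6 pixels/frame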

One concern I have for next gen is how recent console games have been really struggling on the framerate side. A decent bump in performance will be absorbed just getting current engines running at a reasonable framerate.

Not necessarily. We see studios preparing for next gen now, and that should help the transition to next gen development. Anvil Next, Frostbite 2, CryEngine 3, the Halo 4 engine, and more are all engines that have been cited as the next step in preparing for next gen consoles.

I know, but transparencies are really relevant. They mean grass, vegetation, special effects, and I think even an edge on the polycount front; of course, the Xbox was more modern thanks to its shaders, but there were areas where the PS2 hardware beat the Xbox's strengths.

Well yes, but there wasn't really enough power in last gen consoles to worry about grass/vegetation. Smoke, explosions, and other alpha transparencies likely impacted the Xbox more than the PS2 though.

There were things the PS2 did better than the Xbox, just like there were aspects of the GC that were better than the Xbox and PS2.
 
When ppl were flaming me for doubting a "10x power increase" for next gen in the PS3 rumor thread... they've gone real quiet real quick.

Guess arbitrary numbers (flops or whatever) don't actually mean anything when talking about the end result.

PS4 could be 20x the PS3; that doesn't mean devs are going to target 60fps. Like a broken record.
 
When ppl were flaming me for doubting a "10x power increase" for next gen in the PS3 rumor thread... they've gone real quiet real quick.

Guess arbitrary numbers (flops or whatever) don't actually mean anything when talking about the end result.

10x is very plausible. Moore's law has the same budget scaling from a 2005 system to a 2013 one at roughly 27x.

You forget what the performance increase goes into. Bumping to 1080p isn't free. High-res textures aren't free. MSAA isn't free. 16x AF isn't free. Tessellation isn't free. Etc. 10x is my guess because I bet the COGs (cost of goods) will be lower and some of the hardware will be reserved for streaming.
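For what it's worth, a figure like 27x is basically a doubling-period assumption, so it's worth seeing how sensitive it is:

def scaling(years: float, doubling_period_months: float) -> float:
    """Transistor-budget multiple after `years` at a given doubling period."""
    return 2 ** (years * 12 / doubling_period_months)

print(scaling(8, 24))  # ~16x (two-year doublings, 2005 -> 2013)
print(scaling(8, 20))  # ~28x (the quoted ~27x needs ~20-month doublings)
print(scaling(8, 18))  # ~40x (the classic 18-month reading)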
 
I expected this: visual fidelity, with its "wow" factor, will continue to take priority over framerate.

Here's hoping we get a steady 30 fps and a 720p baseline resolution, no more of this "sub HD" stuff.
 
You mean 120 fps, comrade.
That's either crazy high-end gaming, or putting something like the Falcom games or Quake 1-3 through your modern rig. And I half expect you could pull 240 on those (I'd really like to see 60/120/240 side by side. Would I still notice a difference by 240?)
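The frame-time math hints at the answer: each doubling buys half the absolute improvement of the one before it.

for hz in (30, 60, 120, 240):
    print(f"{hz:>3} Hz -> {1000 / hz:6.2f} ms per frame")
# 30 -> 33.33 ms, 60 -> 16.67 ms, 120 -> 8.33 ms, 240 -> 4.17 ms
# 60 -> 120 shaves 8.3 ms off each frame; 120 -> 240 shaves only 4.2 ms more,
# so the visible difference keeps shrinking even though the number doubles.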
 
I expected this: visual fidelity, with its "wow" factor, will continue to take priority over framerate.

Here's hoping we get a steady 30 fps and a 720p baseline resolution, no more of this "sub HD" stuff.

Oh no, please, really. 720p and a steady 30 fps is simply ridiculous for next generation. I'd be fine with sub 1080p, but 720p is really too low; what the hell are we talking about here? A jump in resolution is the minimum.
 
My expectation isn't all games running at 1080p @ 60fps. But launch/early up-ports from this gen had better be flaunting 1080p @ 60fps. I don't need the next Red Dead game to be 60fps at 1080p. But I do expect some gems like GT5 and WipeOut to push the barriers.
 
That's either crazy high-end gaming, or putting something like the Falcom games or Quake 1-3 through your modern rig. And I half expect you could pull 240 on those (I'd really like to see 60/120/240 side by side. Would I still notice a difference by 240?)

Not really; vanilla Skyrim (HD textures, mesh mods, etc.) runs at over 120 fps for me, and with high-end ENB mods and such it drops down to 50-60.
 
That's either crazy high-end gaming, or putting something like the Falcom games or Quake 1-3 through your modern rig. And I half expect you could pull 240 on those (I'd really like to see 60/120/240 side by side. Would I still notice a difference by 240?)

I already do this when possible. Sonic Generations at 120 fps was great.

Remember that a modern GPU is already more than a generation beyond the Xbox 360.
 
Not really; vanilla Skyrim (HD textures, mesh mods, etc.) runs at over 120 fps for me, and with high-end ENB mods and such it drops down to 50-60.

I already do this when possible. Sonic Generations at 120 fps was great.

Remember that a modern GPU is already more than a generation beyond the Xbox 360.
With me leaving V-Sync on (and some games like Max Payne 3 actually straining the card), I may be overestimating how demanding games really are.
 
Whose PC? Not mine, and not the majority of other PC users'.
Seriously?

Even a GTX 560 can run any game released today at console-level settings and maintain a constant, unending 60 FPS.

Upgrade to something like a 670 and you have to start downsampling to even see the majority of games dip below the 60 FPS threshold.
 