
John Carmack: "Many next-gen games will still target 30 fps"

Sony and Microsoft should make 60 fps mandatory.


Look at the differences between Far Cry 3 on PS3 and PC:


ps3_0210fu8f.png

(direct feed, PS3)

far-cry-3-max-settin4wkjq.jpeg

(downsized from 1920x1080, max settings PC)


To the average Joe, the differences - if he even sees any - are definitely not worth a new $500 console.

But one runs at 20 fps, the other at 60 fps.
 
John Carmack delivered the (shocking?) news via Twitter:



Those of you who crave 60 fps, PC gaming's got you covered.
I bet it won't right out of the gate...

I can guarantee that games designed with updated technology in mind won't be easy to max out at 60 fps on PC hardware available at the time.
 
Cool with me.

I've no problems with 30 fps.

30fps with all those fancy effects can really mask the fact that it's 30fps. Sure, 60fps is lovely, but I would rather have a locked 30 with all the eye candy on than 60 with stuff reduced down.

Would be nice to have next-gen give you the options though.
 
People are seriously underestimating Call of Duty players if they think the players don't know the frame rate is a known benefit for the series. There are countless articles about it, and the developers constantly defend the use of their archaic engine by touting the 60 frames per second it makes possible.
 
You don't think PC will be able to do 60 fps on the first wave of next-gen games with tweaking?

My guess is that he means that nowadays simply doing some tweaking won't guarantee 60 fps, not due to system reqs per se but rather to optimization.

In any case, unless the ports are broken or bad like some recent multiplatform games, I'd be shocked if a modern high-end PC can't hold 60 with moderate settings and a low resolution.
 
Sony and Microsoft should make 60 fps mandatory.


Look at the differences between Far Cry 3 on PS3 and PC:


[IMG]http://images.eurogamer.net/2012/articles//a/1/5/3/2/4/6/5/PS3_021.png[/IMG]
(direct feed, PS3)

[IMG]http://www.abload.de/img/far-cry-3-max-settin4wkjq.jpeg[/IMG]
(downsized from 1920x1080, max settings PC)


To the average Joe, the differences - if he even sees any - are definitely not worth a new $500 console.

But one runs at 20 fps, the other at 60 fps.
And what's your point? If the console version ran at 60 fps the differences between the console and PC versions would be much bigger.
 
You don't think PC will be able to do 60 fps on the first wave of next-gen games with tweaking?
I hope you're right.

I still run into problems with ports of existing console games and a lot of developers fuck up the monitor timing resulting in serious hitches when the framerate cannot be maintained.

Need for Speed Most Wanted gave me problems recently. When the framerate drops to 57 or 58 fps it looks so much worse than it should.

Far Cry 3 holds 60 fps now, but it required a lot of tweaking and I'm stuck being unable to enjoy the postfx settings.

Hitman Absolution has points where it drops even with the settings in the basement.

Older games such as Metro 2033 still struggle to hold a solid framerate.

I don't yet know WHAT we will see next gen when engine requirements become much higher.

Maybe my PC just sucks. I swapped my GTX580 for a GTX680 recently and I'm using an OCed i5-3570k@4.5 in addition to 16gb of memory and multiple SSDs for gaming. I don't know what the fuck else I can do.

Shit like Most Wanted frustrates the hell out of me as it's so close to delivering what I want but even with EVERY setting at minimum and a 1280x720 resolution I STILL can't fully eliminate slowdown (that damn industrial district is an issue). Though, really, lowering most settings really doesn't seem to have much of a performance impact (outside of geometry). I could blame each of these games for being unoptimized, but these are games I want to play and this kind of thing has happened way more often this year than ever before.
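As an aside, the frame-time math explains why a dip to 57 or 58 fps feels so much worse than the number suggests: under double-buffered vsync on a 60 Hz display, any frame that misses the ~16.7 ms deadline is held on screen for a full extra refresh. A rough sketch of that effect (assuming plain double-buffered vsync, no triple buffering or adaptive sync):

```python
import math

# Frame *display* times under double-buffered vsync on a 60 Hz panel:
# a frame is shown for a whole number of refresh intervals, so just
# missing the deadline doubles its time on screen.
REFRESH_MS = 1000.0 / 60.0  # one refresh: ~16.7 ms

def displayed_ms(render_ms):
    """How long a frame that took render_ms to draw stays on screen."""
    intervals = max(1, math.ceil(render_ms / REFRESH_MS))
    return intervals * REFRESH_MS

print(displayed_ms(16.0))  # ~16.7 ms: made the deadline
print(displayed_ms(17.2))  # ~33.3 ms: held for two refreshes
```

So a game averaging 58 fps isn't showing slightly-long frames; it's mixing ~16.7 ms frames with ~33.3 ms ones, which is exactly the lurching cadence described above.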
 
I hope you're right.

I still run into problems with ports of existing console games and a lot of developers fuck up the monitor timing resulting in serious hitches when the framerate cannot be maintained.

Need for Speed Most Wanted gave me problems recently. When the framerate drops to 57 or 58 fps it looks so much worse than it should.

Far Cry 3 holds 60 fps now, but it required a lot of tweaking and I'm stuck being unable to enjoy the postfx settings.

Hitman Absolution has points where it drops even with the settings in the basement.

Older games such as Metro 2033 still struggle to hold a solid framerate.

I don't yet know WHAT we will see next gen when engine requirements become much higher.

Maybe my PC just sucks. I swapped my GTX580 for a GTX680 recently and I'm using an OCed i5-3570k@4.5 in addition to 16gb of memory and multiple SSDs for gaming. I don't know what the fuck else I can do.

Shit like Most Wanted frustrates the hell out of me as it's so close to delivering what I want but even with EVERY setting at minimum and a 1280x720 resolution I STILL can't fully eliminate slowdown (that damn industrial district is an issue). Though, really, lowering most settings really doesn't seem to have much of a performance impact (outside of geometry). I could blame each of these games for being unoptimized, but these are games I want to play and this kind of thing has happened way more often this year than ever before.
Isn't that Most Wanted bug getting looked at by the devs?

Far Cry 3 just follows what I said, if you value it you can get 60.

And couldn't Metro 2033 hit a locked 60 by removing some of the high-end poorly optimized options + a lower res if the user valued 60 fps over the top-tier settings?

All I know is that I'm not sitting on my current PC build when next-gen hits, I'm going to do a mobo + proc + ram + videocard refresh and I do expect to be able to continue to get 60 fps because I am willing to spend time tweaking games.
 
It makes the most logical sense. Consoles should be for getting the most possible out of the games while hitting 720p and 30fps as much as they can, so I can play them on PC and get 60 while still getting a beastly looking game.
 
ya and the premise behind it is pretty dope.

Honestly, for all the talk, id Tech 5 required an SSD to neutralise the incessant texture loading issue, there were plenty of low resolution textures even on the highest settings, and the sky was static. Point is, I felt it was blown up to be a much bigger deal than the end result warranted. And are they developing one engine per game? It's a crying shame that id can't license their engine, and yet most of the other games under the parent company don't share it.

Who knows, maybe when Fallout and ES become prehistoric technical fossils (because they already are dinosaurs) they'll reconsider replacing their beloved Gamebryo with something that belongs in modern times, with id's help.
 
Isn't that Most Wanted bug getting looked at by the devs?

Far Cry 3 just follows what I said, if you value it you can get 60.

And couldn't Metro 2033 hit a locked 60 by removing some of the high-end poorly optimized options + a lower res if the user valued 60 fps over the top-tier settings?

All I know is that I'm not sitting on my current PC build when next-gen hits, I'm going to do a mobo + proc + ram + videocard refresh and I do expect to be able to continue to get 60 fps because I am willing to spend time tweaking games.
Couldn't pull it off with Metro, unfortunately (DX9 + lowest setting still results in weird lurching performance dips at points).

Those were just examples, but I've been encountering it far too often as of late. I always see people talking about playing PC games with SGSSAA and insanely high resolutions like it were nothing.

Of course, as I've said before, I also know that some folks are much less picky about performance, as I've personally experienced setups where the user was amazed by the fluidity when, in reality, the game was dropping frames left and right. So I'm really left wondering how smooth the average GAF user's PC experience really is. There are definitely some picky users here, but I don't think that applies to the majority.
 
Do you remember how I got berated for claiming that by the end of the next gen we'll most likely be going back to 720p@30fps for "maximum prettiness"?

Personally, I think we'll see a lot of games with dynamic resolution. As for fps, if it's locked at 30fps, then I am okay with it.

Sorry I don't remember you being berated for claiming that. I have said the same plenty of times.

I also agree with you that we'll see plenty of games using dynamic frame buffers.

Honestly, it's not inherently untrue either. PS1/N64/Saturn games generally had an awful framerate, DC/PS2/GC/Xbox jumped way up, and I think part of why this generation stayed the same or started dipping was because of the resolution jump. Next generation, well, I guess it depends on how much more taxing 1080p is.
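For a rough sense of how much more taxing 1080p is: shading and fill-rate cost scale roughly with pixel count (a simplification that ignores geometry, CPU, and bandwidth effects). A quick calculation:

```python
# Rough cost comparison: shading/fill-rate work scales about linearly
# with rendered pixels (ignoring geometry, CPU and bandwidth effects).
res_720p = 1280 * 720      # 921,600 pixels
res_1080p = 1920 * 1080    # 2,073,600 pixels
print(res_1080p / res_720p)  # 2.25: 1080p shades 2.25x the pixels of 720p
```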

Yeah it's true that it jumped up last gen, especially with the average DC game and early PS2 games. However it didn't take long for games to start focusing on graphics at the expense of the frame rate. Racers, fighters, and action games remained 60fps in many cases, but we also saw games like Fable and SoTC where we saw an obvious impact to the frame rate.

I think it's typical for games to have better than average frame rates early in the generation, but when developers are looking to cut corners to gain more performance, resolution and frame rate are the two things to be cut first. This is why I expect most titles to be 1080p and/or 60fps early next gen, with one or both reduced as the generation goes on to render prettier pixels.

yep. anyone with functioning eyes can see that there's a difference, whether they care or not.

I think many can see the difference when it's presented to them, but when looking at a standalone game, I doubt many could tell if it's 60fps or 30fps.

Sony and Microsoft should make 60 fps mandatory.

This would be an absolutely horrible idea and would hinder any company that tries to enforce this rule.

People are seriously underestimating Call of Duty players if they think the players don't know the frame rate is a known benefit for the series. There are countless articles about it, and the developers constantly defend the use of their archaic engine by touting the 60 frames per second it makes possible.

IMO most CoD players know how CoD "feels" but I doubt many directly relate that to the frame rate.
 
Personally, I think we'll see a lot of games with dynamic resolution. As for fps, if it's locked at 30fps, then I am okay with it.
THAT would be fucking awesome.

I really wish this were an option on the PC as well, to be honest. I love the idea of only dropping resolution when a scene becomes too demanding.

There are loads of PC games that run perfectly most of the time but manage to drop in specific situations where this could be a benefit. Would be especially nice when coupled with more aggressive AA. Allow us to use higher quality AA until performance becomes an issue and then drop it temporarily.
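The wished-for dynamic resolution option boils down to a feedback loop: compare recent frame times against the frame budget and nudge the render scale accordingly. A toy sketch of that controller (the thresholds, step size, and bounds are made-up illustration values, not from any real engine):

```python
# Toy dynamic-resolution controller: nudge the render scale so frame
# time converges on the 60 fps budget. The thresholds, step size and
# clamps are arbitrary illustration values, not from a real engine.
BUDGET_MS = 1000.0 / 60.0

def adjust_scale(scale, last_frame_ms, step=0.05, lo=0.5, hi=1.0):
    """Return a new resolution scale factor, clamped to [lo, hi]."""
    if last_frame_ms > BUDGET_MS * 1.05:    # over budget: render fewer pixels
        scale -= step
    elif last_frame_ms < BUDGET_MS * 0.85:  # plenty of headroom: scale back up
        scale += step
    return min(hi, max(lo, scale))

# Simulated frame times: a heavy scene that eases off as the scale drops.
scale = 1.0
for frame_ms in [22.0, 20.0, 18.0, 15.0, 12.0]:
    scale = adjust_scale(scale, frame_ms)
print(round(scale, 2))  # settles around 0.9 of native resolution
```

A real implementation would smooth frame times over a window and resize in hardware-friendly increments, but the control logic is essentially this.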
 
Couldn't pull it off with Metro, unfortunately (DX9 + lowest setting still results in weird lurching performance dips at points).

Those were just examples, but I've been encountering it far too often as of late. I always see people talking about playing PC games with SGSSAA and insanely high resolutions like it were nothing.

Of course, as I've said before, I also know that some folks are much less picky about performance, as I've personally experienced setups where the user was amazed by the fluidity when, in reality, the game was dropping frames left and right. So I'm really left wondering how smooth the average GAF user's PC experience really is. There are definitely some picky users here, but I don't think that applies to the majority.

honestly, I'm going to hold off upgrading my PC until probably 2014 sometime at this point, and pick up one or both new consoles next year instead to enjoy games 'as is'. until I can build a PC to get me what I want (reliable 60 fps) it's not worth me upgrading, because I think you're right.

I'll spend hours tweaking and adjusting to try and achieve fluid framerates on PC and that isn't even always currently possible. until I resign myself to that reality (or achieve a locked framerate) I can't start enjoying the game. on consoles, I accept the game for what it is (so long as it doesn't drop much below 30 fps) and just enjoy it right out of the gate.

I'm going to limp along on my current hardware until the time I can build a PC that can comfortably maintain 60 fps on multiplats for a few years. that's basically what I did this gen too, and it worked out pretty nicely.
 
honestly, I'm going to hold off upgrading my PC until probably 2014 sometime at this point, and pick up one or both new consoles next year instead to enjoy games 'as is'. until I can build a PC to get me what I want (reliable 60 fps) it's not worth me upgrading, because I think you're right.

I'll spend hours tweaking and adjusting to try and achieve fluid framerates on PC and that isn't even always currently possible. until I resign myself to that reality (or achieve a locked framerate) I can't start enjoying the game. on consoles, I accept the game for what it is (so long as it doesn't drop much below 30 fps) and just enjoy it right out of the gate.

I'm going to limp along on my current hardware until the time I can build a PC that can comfortably maintain 60 fps on multiplats for a few years. that's basically what I did this gen too, and it worked out pretty nicely.
When a 30 fps lock works properly, I will use that at times, but it is not something you can count on. Usually a mix of MSI Afterburner + half refresh rate will do the trick, but with some games, it just doesn't work properly (Far Cry 3 being a recent example).

I'd always prefer 60 fps, but sometimes holding a steady 60 fps requires too great a sacrifice elsewhere and I'd prefer higher quality visuals at a stable 30 fps.

Crysis 2 DX11 was a good example of this as I was able to use a very high resolution, better AA, and all of the DX11 features maxed out at a rock solid 30 fps. For 60 fps, the DX11 stuff has to go. I love how easy it is to achieve both of these options in Crysis 2, though. Wish other games worked that well...
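The half-refresh-rate trick mentioned above amounts to pacing frames on a fixed ~33.3 ms cadence. A bare-bones sketch of that pacing logic (tools like Afterburner/RTSS do this far more precisely inside the game's present call; this is only an illustration):

```python
import time

# Bare-bones 30 fps limiter: after rendering each frame, sleep until
# the next ~33.3 ms deadline so frames come out on an even cadence.
TARGET_S = 1.0 / 30.0

def limit(next_deadline):
    """Sleep until next_deadline (perf_counter seconds); return the next one."""
    now = time.perf_counter()
    if now < next_deadline:
        time.sleep(next_deadline - now)
    return max(now, next_deadline) + TARGET_S

# Usage: call once per frame after rendering.
deadline = time.perf_counter()
for _ in range(3):
    # ... render the frame here ...
    deadline = limit(deadline)
```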
 
Sony and Microsoft should make 60 fps mandatory.


Look at the differences between Far Cry 3 on PS3 and PC:


PS3_021.png

(direct feed, PS3)

far-cry-3-max-settin4wkjq.jpeg

(downsized from 1920x1080, max settings PC)


To the average Joe, the differences - if he even sees any - are definitely not worth a new $500 console.

But one runs at 20 fps, the other at 60 fps.
That PC screenshot (I can't see the console one) looks like a cartoony-style game. I would have never thought this was as serious a game as Far Cry if not mentioned by name. People really prefer these bright colors for gritty war-themed games?
 
Of course, as I've said before, I also know that some folks are much less picky about performance, as I've personally experienced setups where the user was amazed by the fluidity when, in reality, the game was dropping frames left and right. So I'm really left wondering how smooth the average GAF user's PC experience really is. There are definitely some picky users here, but I don't think that applies to the majority.
The vast majority of people accept a framerate drop here or there because there is only so much you can account for and there will always be situations that exceed that (tons of explosions on screen, etc). The same logic applies to console performance and how tolerant people are of drops below 30.
 
People are acting like this is news? Every gen there is an assumption that 60fps will become the defacto standard on consoles and it just won't ever happen. I'd love it to be true but when given limited power 30fps tends to be a better target from a developer's perspective. I do have some hope that 60fps will be more common than this gen though, hopefully similar to last gen. Still, if 30 is going to be common it better be a solid 30 and have some nice motion blur to smooth out the motion.
 
When a 30 fps lock works properly, I will use that at times, but it is not something you can count on. Usually a mix of MSI Afterburner + half refresh rate will do the trick, but with some games, it just doesn't work properly (Far Cry 3 being a recent example).

I'd always prefer 60 fps, but sometimes holding a steady 60 fps requires too great a sacrifice elsewhere and I'd prefer higher quality visuals at a stable 30 fps.

Crysis 2 DX11 was a good example of this as I was able to use a very high resolution, better AA, and all of the DX11 features maxed out at a rock solid 30 fps. For 60 fps, the DX11 stuff has to go. I love how easy it is to achieve both of these options in Crysis 2, though. Wish other games worked that well...

as we've talked about before, FPS games are one of the few genres where a bit of judder doesn't drive me insane. I played C2 with all the DX11 shinies and averaged something like 58 fps and loved it without ever being distracted by framerate fluctuation. Yet, I play Rayman Legends and a single drop to 58 at the end of one level is massively distracting to me. Platformers, racing games, even something like Batman Arkham City or Dead Space - I can't help but notice when I lose 60 fps... for some reason FPS games are an exception.

Hopefully, with cinema pushing higher framerates, game designers will push them as well, but we're a few years away from HFR approaching widespread acceptance.
 
That PC screenshot (I can't see the console one) looks like a cartoony-style game. I would have never thought this was as serious a game as Far Cry if not mentioned by name. People really prefer these bright colors for gritty war-themed games?

most of the Far Cry series has brightly coloured environments. Two was the only exception; One and all its spin-offs were very colourful. It's a really nice break from the drab experience of most other shooters on the market.
 
That PC screenshot (I can't see the console one) looks like a cartoony-style game. I would have never thought this was as serious a game as Far Cry if not mentioned by name. People really prefer these bright colors for gritty war-themed games?

Far Cry 3's characters are pretty cartoony.
 
The vast majority of people accept a framerate drop here or there because there is only so much you can account for and there will always be situations that exceed that (tons of explosions on screen, etc). The same logic applies to console performance and how tolerant people are of drops below 30.

Just look at any good benchmark comparison on PC hardware sites - no matter how powerful the hardware is, you will still very frequently see new games' minimum framerates dropping below 60fps. You usually have to go like 2 generations beyond the current best to have rock solid unfaltering 60fps (and in that case, the averages will be about 90-100).
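Put in frame-time terms, that headroom is easy to ballpark: a locked 60 fps is a per-frame ~16.7 ms deadline, so if worst-case frames run roughly 1.5x the typical one (a loose rule of thumb, not a measured figure), you need typical frames around 10-11 ms, i.e. a 90-100 fps average:

```python
# Frame-time headroom needed for an unfaltering 60 fps. Loosely
# assumes worst-case frames take ~1.5x the typical frame time.
budget_ms = 1000 / 60          # 60 fps deadline: ~16.7 ms per frame
for avg_fps in (60, 75, 90, 100):
    typical_ms = 1000 / avg_fps
    headroom = budget_ms / typical_ms   # spike tolerance before a drop
    print(avg_fps, round(typical_ms, 1), round(headroom, 2))
```

Only around a 90-100 fps average does the headroom reach that assumed 1.5x spike factor, which lines up with the "2 generations beyond" observation.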
 
I would rather have 60 fps.

On the other hand, if a locked 30 FPS can afford more performance in other areas, I am all for that.
 
I still run into problems with ports of existing console games and a lot of developers fuck up the monitor timing resulting in serious hitches when the framerate cannot be maintained.
A lot of current games' DX11 features are bolted onto code based on earlier console tech. Personally I expect DX11 games that are built from the ground up to provide better performance.
 
Until hardware outclasses software needs by a significant amount, this will always be the case.

PC gamers benefit from this, so I can't complain too much.
 