
Why does 1080p/60fps seem so hard to achieve?

Well, they ARE porting a game that probably used every trick Naughty Dog had up its sleeve to make the game as impressive as it is on PS3. Getting that kind of stuff running on a completely different kind of hardware, even if it's more powerful, isn't an easy task. If The Last of Us had been made for the PS4 from the start and they had aimed for the graphical level of the PS3 version (excluding IQ and maybe some of the improvements they are making), then they'd probably have an easier time achieving 1080p/60fps.
 
Every game wants to look its best. Sometimes that's not possible at 1080p/60. For TLoU, there'd be a shit storm if it looked like the PS3 one, regardless of what resolution or fps it ran at.
 
Most of the same people that complain about targeting 30fps would complain that the graphics aren't impressive if the game is 60fps. Why not ask for 120fps while you are at it?
 
This is exactly why it is such a bad idea to buy a console.


Baffling. You're concerned enough about a game rendering at the specs of your television that you would make sacrifices for 1080p, yet you don't take into consideration the far more important native refresh rate of your television, the spec that actually matters when it comes to video games.

Native refresh rate being more important is your opinion and is not shared by everyone. I would much prefer native res over native refresh. Having both is ideal but not always necessary.
 
Why do these kinds of threads keep popping up? It's not that the hardware isn't powerful enough; it really doesn't matter how powerful the hardware is, developers would just prefer to make the graphics prettier than target 60fps. They could make Uncharted PS4 1080p/60fps, but it would undoubtedly look better at 30fps, because the higher the framerate, the more demanding it is on resources. It's practically as simple as math.
 
Reviewers can't tell the difference between 30 fps and 60 fps and don't care. If they started docking points for frame rates and low res in major magazines, every major AAA game would start hitting 1080p and 60fps within one or two games in the series.
 
Not to worry, by the time we start getting 1080p at 60fps more consistently companies will be thrusting 4k upon us like there's no tomorrow. Gotta keep chasing that dragon...
 
Native refresh rate being more important is your opinion and is not shared by everyone. I would much prefer native res over native refresh. Having both is ideal but not always necessary.
That is because you don't have a magical computer that runs all games at 1080p/60fps for 8 years and only costs $500, like most PC advocates do.
 
In terms of TLoU remake, they have to retool the engine that was heavily optimized to make the most of the PS3. It's a lot of work to go back and change all the rendering tricks specific to the PS3 and get it running elsewhere while maintaining the same feel.

Everything will improve on all fronts after a few years of the engines being on the consoles and a ton of documentation accumulating from those working on them. That will bring new techniques and methods for programming specific functions to get the most out of the hardware.
 
Consoles not powerful enough.

Within limits, it doesn't really matter how powerful a machine is. If a dev doesn't prioritise 60fps from the start it's not very likely to happen as budget/time/per-pixel ambition will probably expand to fill up frametime until you're left with a more modestly 'acceptable' rate like 30fps. No matter the machine power, if you can do x at 60fps, you'll always be able to do x*2 at 30fps, or achieve your target visual quality faster/more cheaply with a lower optimisation load at a lower framerate or whatever. Those elements will always tempt a dev away from 60fps if it isn't a critical factor for the developer regardless of the performance available.

To answer the OP's question, most devs don't start with a locked 60fps as a hard requirement.
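
To put rough numbers on that point, here's the frame-budget arithmetic as a minimal sketch (purely illustrative; no real game's costs are this simple):

```python
# Frame budget at a given framerate: halving the framerate doubles the
# time available per frame, which is why "x at 60fps" can become roughly
# "x*2 at 30fps" in terms of per-frame rendering work.

def frame_budget_ms(fps: float) -> float:
    """Milliseconds available to produce one frame at a given framerate."""
    return 1000.0 / fps

for fps in (30, 60):
    print(f"{fps} fps -> {frame_budget_ms(fps):.1f} ms per frame")
# 30 fps -> 33.3 ms per frame
# 60 fps -> 16.7 ms per frame
```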
 
Thankfully. I'll take better effects over extra frames in a game like TLoU.

Except I'd take better visuals in damn near everything over a higher framerate. 30fps is perfectly fine for me. The people that want higher frames are in the minority. That's not to say 60fps isn't better (it is); I just don't care enough, because I don't think the difference between the two is dramatic.
 

Except I'd take better visuals in damn near everything over a higher framerate. 30fps is perfectly fine for me. The people that want higher frames are in the minority. That's not to say 60fps isn't better (it is); I just don't care enough, because I don't think the difference between the two is dramatic.

This.
 
This is my personal opinion, but I think we're nearing the point where the graphics are good enough. Future improvements need to come from other places like AI, lighting, animations, etc., which are much harder to improve.
 
If MGSV, an open world game, can pull off native 1080p and a rock solid 60fps while looking as good as it does, there is no reason not to achieve this standard. It could even be argued that achieving this standard is easier and less costly for the developer than bumping up the eye candy. One would assume that more developers would embrace this, what with all the doom and gloom about how expensive AAA game development is, and shit. And yet, we keep getting 30fps games.
 
Because 1080p60 isn't a fixed performance cost despite what way too many people seem to think.

This. Whether or not a game can run at that resolution and framerate on a particular piece of hardware depends on different factors that change from game to game.

60 frames per second has been a possibility on pretty much every game console ever (at least since the NES). It just depends on how the developer is willing to balance visual effects and frame rate.

And with porting old games to new consoles, I imagine some performance cost is eaten up by getting the game to run differently from how it was originally designed. This could especially be the case for PlayStation games which either used a lot of tricks unique to the PS2 (at the time) or were designed to specifically utilize the PS3's Cell processor in certain ways.

If MGSV, an open world game, can pull off native 1080p and a rock solid 60fps while looking as good as it does, there is no reason not to achieve this standard. It could even be argued that achieving this standard is easier and less costly for the developer than bumping up the eye candy. One would assume that more developers would embrace this, what with all the doom and gloom about how expensive AAA game development is, and shit. And yet, we keep getting 30fps games.

MGS V, even Phantom Pain, is still being designed on a PS3/360 foundation. That fact alone holds it back significantly compared to games like Assassin's Creed Unity or Batman Arkham Knight in terms of the visual effects it will employ.
 
Baffling. You're concerned enough about a game rendering at the specs of your television that you would make sacrifices for 1080p, yet you don't take into consideration the far more important native refresh rate of your television, the spec that actually matters when it comes to video games.

And I am baffled that some people are incapable of understanding why someone might have a different opinion than theirs. But by all means, go on bashing people for buying consoles and claiming your opinions are superior.
 
You're talking about going from 27,648,000 pixels/second (720p at 30fps) to 124,416,000 pixels/second (1080p at 60fps) on a different architecture, using an engine that was optimized for the PS3. That's not trivial.
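
Checking those numbers (they work out exactly to 720p at 30fps versus 1080p at 60fps):

```python
# Pixel throughput for TLoU on PS3 (1280x720 at 30fps) vs. the remaster's
# target (1920x1080 at 60fps). The 4.5x factor is raw pixel count only;
# it says nothing about per-pixel shading cost, which also changes.

ps3 = 1280 * 720 * 30    # 27,648,000 pixels/second
ps4 = 1920 * 1080 * 60   # 124,416,000 pixels/second
print(ps4 / ps3)         # 4.5
```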
 
In a general sense, certain forms of post-processing effects or behind-the-scenes simulations can make a solid 60fps unrealistic. Game engines also have their own limitations, no matter how much brute force the hardware gives them access to. It's the same reason certain PC titles are regarded as better optimized than others. With the expectations behind modern game design, 1080p/60fps isn't always a given, particularly when you take hardware affordability into account.

Closer to your original point, even an older engine won't necessarily scale up in line with better hardware, partly because so many titles are heavily optimized for the original platform. All the logistics of that port are compounded (in most cases) by the engine being asked to punch above its weight with additional enhancements.
 
One reason: going from 30fps to 60fps does not require 2x the power. It's more like 2.5x-3x the processing power.

Why?

Every rendered frame has some overhead applied to it regardless of resolution: AI processing, physics processing, and everything else. That cost is static.


Think about it in milliseconds-per-frame instead of frames-per-second.

30FPS: You have 33.3 milliseconds to render each frame.
60FPS: You have 16.7 milliseconds to render each frame.

AI / Physics / etc. are all constant. Let me pull a number out of my ass and say all that stuff takes 7 milliseconds. So 7 milliseconds is gone from each frame regardless of resolution. At 60FPS, you only have 9.7 left for graphics rendering, and at 30FPS, you have 26.3 left.

With those numbers, you'd need to render each frame at 60FPS in about 37% of the time you'd have if you were instead going for 30FPS, i.e. rendering would have to be roughly 2.7x faster.
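
The same arithmetic as a runnable sketch (the 7 ms figure is the made-up one from above, not a measurement):

```python
# Time left for rendering after a fixed per-frame cost (AI, physics, etc.)
# that doesn't shrink when you raise the framerate.

FIXED_MS = 7.0  # invented for illustration

def render_budget_ms(fps: float) -> float:
    return 1000.0 / fps - FIXED_MS

b30, b60 = render_budget_ms(30), render_budget_ms(60)
print(f"30fps: {b30:.1f} ms for rendering")                 # 26.3 ms
print(f"60fps: {b60:.1f} ms for rendering")                 # 9.7 ms
print(f"60fps budget is {b60 / b30:.1%} of the 30fps one")  # ~36.7%
# i.e. rendering has to be ~2.7x faster, not just 2x.
```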
 
Gotta be honest, I feel like Tomb Raider negates a lot of the comments here. "Hardware can't do it without sacrificing graphic quality", "consoles not powerful enough", etc. Tomb Raider did it, with enhanced visuals, Tresswhateverthefack, and four months after the system came out. That group really exposed the potential of the PS4, setting the bar so to speak. There might be a million reasons why a dev doesn't do it, but don't piss on the system itself. It's more than capable for port upgrades. Hell, Kojima said that not only is 60 frames possible, but there's even room to spare.
 
Because contrary to what GAF thinks, gamers value visuals more than they value FPS.

I'm not so hung up on FPS. My pet peeve is when level design is sacrificed in order to improve image quality. Levels become more linear in order to limit how much has to be drawn. That seems to be a trend as a generation gets older. Games keep trying to one-up each other and their prior versions by doing this, and in my opinion it really hurts gameplay.
 
Still images are still used to sell games, and still images look better with more effects. Obviously frame rate doesn't matter in that case :)

Because contrary to what GAF thinks, gamers value visuals more than they value FPS.

Can we stop treating framerate as if it weren't part of "visuals" in these discussions?
 
Because 1080p60 isn't a fixed performance cost despite what way too many people seem to think.

This. "1080p60" tells you the fill rate needed, but performance and power are so much more than that. Everything depends on the engine, the individual game, and how much time and effort the developers put in. Remember that there were 1080p60 games on Xbox 360 and PS3. Hell, Okami HD effectively runs at 3840x2160.

It would be great if more devs prioritized resolution and framerate highly enough to make whatever cuts were necessary to meet their targets, but that's not the world we live in unfortunately.
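
A toy model of why the cost isn't fixed: the same 1080p60 pixel count can be cheap or expensive depending on how much work the engine does per pixel. All numbers below are invented for illustration.

```python
PIXELS_1080P = 1920 * 1080

def gpu_frame_ms(ns_per_pixel: float, overdraw: float = 1.0) -> float:
    """Naive linear model: frame time scales with pixels shaded."""
    return PIXELS_1080P * overdraw * ns_per_pixel / 1e6

print(f"light shading: {gpu_frame_ms(2.0):.1f} ms/frame")        # ~4.1 ms, easy 60fps
print(f"heavy shading: {gpu_frame_ms(12.0, 1.5):.1f} ms/frame")  # ~37.3 ms, below 30fps
```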
 
MGS V, even Phantom Pain, is still being designed on a PS3/360 foundation. That fact alone holds it back significantly compared to games like Assassin's Creed Unity or Batman Arkham Knight in terms of the visual effects it will employ.

While that statement is technically true, it overlooks the fact that the last-gen versions of MGSV are running at 30fps, sub-720p. A resolution that bad is, if anything, an indication that the game is too much for those consoles to handle.

Also, we haven't seen enough about Unity or Arkham Knight to judge whether they feel "more next-gen" than MGSV, be it in graphics or in the gameplay department.
 
Making games is hard.

Sometimes I think some GAFfers are under the impression that rendering at 1080p60 is just a matter of clicking a box on a screen somewhere and they can't understand why nobody's clicking the box.
 
Because hitting a consistent 60fps usually requires your code to run more than twice as fast as it needs to at 30fps, so it's a bigger effort for developers.
 
Gotta be honest, I feel like Tomb Raider negates a lot of the comments here. "Hardware can't do it without sacrificing graphic quality", "consoles not powerful enough", etc. Tomb Raider did it, with enhanced visuals, Tresswhateverthefack, and four months after the system came out. That group really exposed the potential of the PS4, setting the bar so to speak. There might be a million reasons why a dev doesn't do it, but don't piss on the system itself. It's more than capable for port upgrades. Hell, Kojima said that not only is 60 frames possible, but there's even room to spare.

But the very fact the game was originally built to run on PS3 and Xbox 360 limits the scope of what Crystal Dynamics could originally do when planning out the graphics effects. Cross-gen games will always be much easier to run at 1080p60 than current-gen-only games. There's a difference between slapping new effects on top of a last-gen game and designing a game's graphics from the ground up for current-gen.

Just wait till later this year when we start seeing more games that were actually designed for PS4 and Xbox One (or modern PC hardware). They'll probably look another league beyond Tomb Raider Definitive Edition.
 
Why is 60fps always the magic number?

Non-NTSC regions (i.e. most of the world) lived with 50fps for the majority of the history of video games, and almost all modern TVs can handle 50fps, yet somehow a game running at slightly less than 60 is unplayable?
 
Well, they ARE porting a game that probably used every trick Naughty Dog had up its sleeve to make the game as impressive as it is on PS3. Getting that kind of stuff running on a completely different kind of hardware, even if it's more powerful, isn't an easy task. If The Last of Us had been made for the PS4 from the start and they had aimed for the graphical level of the PS3 version (excluding IQ and maybe some of the improvements they are making), then they'd probably have an easier time achieving 1080p/60fps.
The SPU tricks and threading that ND perfected with the Cell will put them in good stead on the PS4. They'll have a small advantage in that their heavily threaded Cell-era code can be adapted to the PS4's more powerful CPU and used to work it hard to feed the GPU; BF4 on PS4 shows how underused that CPU is. ND should hit the ground running, since their engine is built around working the CPU to within an inch of its life alongside the GPU.

That said, UC4 (if it's shown at E3) will make this game look old. If they just went for 60fps, I think it will be a good, stable 60, at Tomb Raider's level or better.
 
If MGSV, an open world game, can pull off native 1080p and a rock solid 60fps while looking as good as it does, there is no reason not to achieve this standard. It could even be argued that achieving this standard is easier and less costly for the developer than bumping up the eye candy. One would assume that more developers would embrace this, what with all the doom and gloom about how expensive AAA game development is, and shit. And yet, we keep getting 30fps games.

MGS can keep 60fps because it looks like this:
[screenshot]

Personally I think it looks nice and plays great, but it doesn't exactly scream "next-gen!" to me. Certain textures don't look that bad, but overall it's not exactly a looker on PS4. Can't imagine the XB1 version upscaled to full HD *shudders*
 
And I am baffled that some people are incapable of understanding why someone might have a different opinion than theirs. But by all means, go on bashing people for buying consoles and claiming your opinions are superior.
Opinions aren't free from criticism.

Native refresh rate being more important is your opinion and is not shared by everyone. I would much prefer native res over native refresh. Having both is ideal but not always necessary.
Of course it's my opinion; the fact that it's subjective doesn't prevent it from being discussed, however. I have no idea why you hold your stance, though, because in the video game world how frequently you receive new information (framerate) almost always matters far more than how detailed that information is (resolution). Then again, this is why I don't play on console anymore: I don't like that decision being made for me.
 
TLOU does not look 'fine' by PS4 standards. Hell, I don't think even TR:DE looked 'fine' by standards set by KZ:SF and Infamous, and they updated TR a hell of a lot more than anyone expected. I think TLOU has to at least meet the upgrade standard set by Tomb Raider.
 
People who think this has anything to do with hardware performance don't understand what's going on here at all. The PS4 could have been twice as powerful, and we'd still mostly get 30 fps games. The extra power would go into making shinier graphics, not more frames. That's how it always is (with some exceptions), and that's how it always will be. This idea some people have that "the PS4 should be able to hit 1080p60 in all games", or that it would have if it was more powerful, is simply ridiculous, because that's not how that shit works.
 
One reason: going from 30fps to 60fps does not require 2x the power. It's more like 2.5x-3x the processing power.

Why?

Every rendered frame has some overhead applied to it regardless of resolution: AI processing, physics processing, and everything else. That cost is static.


Think about it in milliseconds-per-frame instead of frames-per-second.

30FPS: You have 33.3 milliseconds to render each frame.
60FPS: You have 16.7 milliseconds to render each frame.

AI / Physics / etc. are all constant. Let me pull a number out of my ass and say all that stuff takes 7 milliseconds. So 7 milliseconds is gone from each frame regardless of resolution. At 60FPS, you only have 9.7 left for graphics rendering, and at 30FPS, you have 26.3 left.

With those numbers, you'd need to render each frame at 60FPS in about 37% of the time you'd have if you were instead going for 30FPS, i.e. rendering would have to be roughly 2.7x faster.

This is a good answer -- though it's kind of an incomplete one (deadlines are key, but with hardware concurrency things shift around a bit -- game subsystems have a thread, or multiple threads, working together or separately to fight against the time constraints). Some information is deadline-specific, i.e. I need the result of some computation at the end of this frame, or at the beginning of the next. Other computations are not so stringent about frames, i.e. I can start a raycast in frame 1 and receive the result of it in frame 2.

Deadlines like this make things really hard to deal with, and some really hardcore optimizations sometimes need to take place to ensure you're meeting them (avoiding cache misses, better algorithms, using more memory to offset the time it takes to acquire/release resources). It's hard -- and even when you have it all figured out, the "non-exactness" of the science might mean doing some things at runtime to make up for an unforeseen problem. id Tech 5, for example, adapts to changes in deadlines by lowering resolution on the fly. Great technology, and not everyone has it. I doubt that people working outside of their own tech (say, licensees of UE3) can do a lot of these kinds of optimizations. For them, it's probably about playing the game and seeing what happens to the framerate when they add a new character (in fact, keeping track of how the game's performance changes as you build it was an issue for RAGE as well).
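
A minimal sketch of that kind of dynamic-resolution feedback loop (the target, thresholds, and step size here are invented for illustration; this is not id's actual implementation):

```python
TARGET_MS = 1000.0 / 60.0  # 60fps deadline, ~16.7 ms

def adjust_render_scale(scale: float, last_frame_ms: float,
                        step: float = 0.05, lo: float = 0.5, hi: float = 1.0) -> float:
    """Drop the render resolution when frames miss the deadline, and grow it
    back only when there's clear headroom, so the framerate stays fixed
    instead of the image quality."""
    if last_frame_ms > TARGET_MS:
        scale -= step
    elif last_frame_ms < TARGET_MS * 0.85:
        scale += step
    return max(lo, min(hi, scale))

scale = 1.0
for frame_ms in (15.0, 18.2, 19.5, 16.0, 13.5):  # fake frame timings
    scale = adjust_render_scale(scale, frame_ms)
    print(f"{frame_ms:5.1f} ms -> render scale {scale:.2f}")
```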

So, yeah, the short answer is that it's hard. The question is -- why is it not worth it? Probably because there are very real concessions you may have to make to achieve it. If it's really necessary that a scene have 100 characters on screen, then you can either get smart about how to achieve that within your budget, or you simply don't hit 60 at all times. I personally don't like the latter; I much prefer a fixed 30 over a variable "up to 60."

That said, I hate that it's sometimes not a focus for devs. I feel like if you can't tell your story without impacting performance, you have a problem, and you should be doing the hard work to make up for it. If, for some reason, adding characters to a movie reduced its framerate, do you suppose directors would be okay with that? I doubt it.
 
Tbh, I would want a game designer to enter the thread and give that question a real answer, because otherwise it's going to come down to people making guesses or trying to put two and two together. I really want to know the nitty gritty, even if I don't understand the terms they'll use to explain it.

All I know is it has something to do with Software and Hardware lol.
 
While that statement is technically true, it overlooks the fact that the last-gen versions of MGSV are running at 30fps, sub-720p. A resolution that bad is, if anything, an indication that the game is too much for those consoles to handle.

Also, we haven't seen enough about Unity or Arkham Knight to judge whether they feel "more next-gen" than MGSV, be it in graphics or in the gameplay department.

See my response to the Tomb Raider post...

MGS can keep 60fps because it looks like this*

...and this post as well.

A ton of games on PS3 and 360 struggled to hit 30fps at sub-720p resolutions. They were still designed from the beginning to be able to run on those systems. Kojima has admitted that MGS V was at first designed as a PS3 and 360 game, and only more recently has the PS4 version been put in the spotlight. Ultimately, almost all cross-gen games, if not all, are really just last-gen games with enhanced next-gen versions.

Actual PS4 and Xbox One games will pretty soon start looking at least an order of magnitude better than the likes of Tomb Raider, MGS V, and The Last of Us Remastered. I'll also say the same for Assassin's Creed Unity. The retail game might not have the same image quality as the initial reveal trailer, but the final product will probably look an absolute shitload better than Assassin's Creed IV, simply because its development started off with much more headroom.
 