
Uncharted 4 multiplayer will run at 900p / 60fps

Get outta here with that shit. It's not "objectively" better, it's subjectively better. I do not get more immersed at 60 FPS. It works differently for me, the same way one gets immersed in an action movie. Stop speaking for me.

What? Of course 60fps is objectively better than 30fps, just as 1080p is objectively better than 900p.
 
Lol, you're late! But seriously, it's not so much UC4; for me it's the question of how weak the PS4 really is. I knew it wasn't a powerhouse, but damn! At least hit the bare minimum of 1080p! It's ridiculous to be under that for 10 years.

Most big-budget, graphically intense games that run at 60FPS have to make sacrifices, like Battlefront and so on. It's not that the console can't run it at 1080p and 60FPS; it's the devs prioritizing better graphics over 1080p resolution.

Because most people don't see a big difference between 1080p and 900p, but they do notice graphics differences, and they see/feel the more fluid framerate.
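For what it's worth, the gap between the two resolutions can be put into numbers with quick back-of-the-envelope pixel arithmetic (a rough sketch only; it says nothing about perceived sharpness, upscaling quality, or anti-aliasing):

```python
# Pixel-count comparison of the two resolutions being argued about.
res_1080p = 1920 * 1080   # 2,073,600 pixels per frame
res_900p = 1600 * 900     # 1,440,000 pixels per frame

savings = 1 - res_900p / res_1080p
print(f"900p renders {savings:.1%} fewer pixels per frame than 1080p")
# -> 900p renders 30.6% fewer pixels per frame than 1080p
```

That freed-up ~30% of per-frame shading work is roughly the budget devs can spend on richer effects or on holding 60fps.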
 
Whatever dude. DC runs with real-time GI and UC4 doesn't. Pros and cons. I'm sure they could maintain the resolution if they had more time

This is brought up all the time, but we really have no idea what it is and how it functions. It would be nice if we actually knew.
 
If Dice and ND both have to resort to 900p you can be sure it is the console, not the devs.

It's not just the console or just the devs. It's a mix of both.
The console's hardware alone doesn't determine the resolution or framerate of its games. With consoles you are working with limited hardware, and how to use it is up to the dev. Do you go for the best possible graphics at the cost of framerate? Do you want to push 60fps and 1080p with big cuts to the presentation? Do you think 60fps is important but still want decent graphics, so you decide to lower the resolution to 900p? Do you want 40 players online on big maps with an unlocked framerate, or maybe just teams of 10 locked at 60fps?

So yeah, on one side you have consoles that can't push 1080p and 60fps while maintaining the original vision of the devs. But you also have devs that would rather push graphics over performance or resolution. This concept of "lol these consoles can't do 1080p/60fps" is stupid. Yes they can; they could probably also do 4K and 60fps. It all comes down to the game and what sacrifices the devs are willing to make to reach those numbers.
You will NEVER again get a console that can run games at 1080p/60fps without making sacrifices (unless we move on to streaming games or something like cloud processing). These things have to be sold at low prices if they want to have mass appeal. You are never going to get top of the line performance when the recommended GPU to run something like Battlefront costs about as much as a whole console.

So it's hard for me to understand some people on here. A game pushes for 60fps and 1080p: "lol, looks like last gen". A game pushes for 60fps while sacrificing resolution to still maintain decent graphics: "already down to 900p, these consoles suck and I won't buy the game". A 1080p game goes for 30fps in order to push the graphics as far as possible: "30fps is like a slideshow, devs should always focus on 60fps".
Seriously. Want the best possible graphics with the best possible performance? Go build a high-end gaming PC. You won't find what you are looking for on a $350 console.
 
I don't understand some people. They seem to be living in an alternate world of delusion where the PS4 hardware is equal to a high-end PC.

If they've toned down the resolution, it's because they couldn't lock 60fps at 1080p, which is a perfectly fine compromise. Framerate will always matter more than resolution, especially in multiplayer, for everyone except the couple of geniuses who would only play the multiplayer for an hour.

30fps multiplayer with drops is fucking unplayable. Is it that hard to understand that when games like this or Battlefield drop the resolution to achieve a better framerate it is not the developers being "lazy" or incompetent but them doing what they can with the shitty hardware to make the thing playable?

Some people are embarrassing, jesus christ. 1080p / 30fps for multiplayer would be absolutely worse than this solution, stop crying already. And everyone who plays a decent multiplayer game knows this, so ironically the biggest pool of complainers probably comes from people who play three matches and forget the multiplayer forever.
 
I don't understand some people. They seem to be living in an alternate world of delusion where the PS4 hardware is equal to a high-end PC.

If they've toned down the resolution, it's because they couldn't lock 60fps at 1080p, which is a perfectly fine compromise. Framerate will always matter more than resolution, especially in multiplayer, for everyone except the couple of geniuses who would only play the multiplayer for an hour.

30fps multiplayer with drops is fucking unplayable. Is it that hard to understand that when games like this or Battlefield drop the resolution to achieve a better framerate it is not the developers being "lazy" or incompetent but them doing what they can with the shitty hardware to make the thing playable?

Some people are embarrassing, jesus christ. 1080p / 30fps for multiplayer would be absolutely worse than this solution, stop crying already. And everyone who plays a decent multiplayer game knows this, so ironically the biggest pool of complainers probably comes from people who play three matches and forget the multiplayer forever.

Pretty much.

I really don't think any sane person that plays multiplayer games would take 1080p over 60FPS in this case. Any good multiplayer title with a semblance of longevity at this point needs 60FPS, it's the standard now even on consoles. It's like people forget that 1080p won't matter for shit compared to more fluid framerate and more responsive controls in a fast-paced multiplayer shooter.

People don't play MP games for their graphics; players will get used to it in no time, and so on. Gameplay and responsive controls are what keep people playing MP, hence 60FPS over resolution. I applaud ND for doing this, really; we get the best of both worlds when it comes to the SP and MP.
 
People don't play MP games for their graphics; players will get used to it in no time, and so on. Gameplay and responsive controls are what keep people playing MP, hence 60FPS over resolution. I applaud ND for doing this, really; we get the best of both worlds when it comes to the SP and MP.
Not playing MP for its graphics could also mean that all assets get set to medium-low to render at native resolution, yet at the same fluid 60fps... that you love.
 
Not playing MP for its graphics could also mean that all assets are on medium-low to deliver native resolution at the same fluid 60fps... that you love.

There could be things on the CPU side, which we don't know about, that make it impossible to run the game at 1080p and 60FPS while keeping their vision of the game, both functionally (gameplay-wise) and graphically.

It's pretty reductive to think it's as easy as just "lowering the models and textures" until 1080p/60fps becomes possible.
 
So it's hard for me to understand some people on here. A game pushes for 60fps and 1080p: "lol, looks like last gen". A game pushes for 60fps while sacrificing resolution to still maintain decent graphics: "already down to 900p, these consoles suck and I won't buy the game". A 1080p game goes for 30fps in order to push the graphics as far as possible: "30fps is like a slideshow, devs should always focus on 60fps".
Seriously. Want the best possible graphics with the best possible performance? Go build a high-end gaming PC. You won't find what you are looking for on a $350 console.

LOL so true. No pleasing everyone I guess.

Also even if the PS4 was twice as powerful, games would still run at 30 FPS so devs can push graphics
 
If Dice and ND both have to resort to 900p you can be sure it is the console, not the devs.

Dice are good but not the best technically.

I can't believe ND have pushed the PS4 to its limits with their first game. Just look at Uncharted 1 vs Uncharted 3.

Still, I wish this game was 1080p/60fps throughout.
 
Dice are good but not the best technically.

I can't believe ND have pushed the PS4 to its limits with their first game. Just look at Uncharted 1 vs Uncharted 3.

They technically pushed it to its limits with UC2, if you watch their GDC talk, as they took full advantage of the SPUs, unlike UC1. UC3's improvements were not about pushing the console harder but about finding more efficient ways to do things. Without some crazy architecture this gen, they can start pushing it from the get-go, but that doesn't mean there won't be great improvements with future games.
 
GAF, as much as you would love to, and as much as you say it, 30 fps is not unplayable.
But for safety, get your eyes checked.

You're welcome = )
 
They're just doing whatever people want. If everyone wanted 1080p over 60fps, it would be 1080p and 30fps. Over time the 60fps crowd has become the louder one, so that's what we'll see in games. It was mistakenly thought that people wanted better-looking games over better-performing ones. I know they want both, but that isn't happening yet on these consoles.
 
Dice are good but not the best technically.

I can't believe ND have pushed the PS4 to its limits with their first game. Just look at Uncharted 1 vs Uncharted 3.

Still, I wish this game was 1080p/60fps throughout.

Well...Uncharted 1 suffered from constant framerate issues and screen tearing, while Uncharted 2 held a steady 30fps along with much better graphics.

We'll see, but people probably shouldn't expect some sort of miraculous jump in quality with their next game.

I like how these threads really bring out the stupidity. "60fps isn't better than 30fps!" "You can't even see the difference!". Lol.

Fanboys, how do they work.
 
I'd rather see them cut down on something other than resolution, honestly. Graphics don't really matter in MP, but framerate and resolution do.
 
I'd rather see them cut down on something other than resolution, honestly. Graphics don't really matter in MP, but framerate and resolution do.

That's my take as well. Given that you can't have both pretty visuals and high performance, I'd rather they just make it less splashy to get 1080p/60fps. The amount you can see is up there with input responsiveness in importance in multiplayer.
 
LOL so true. No pleasing everyone I guess.

Also even if the PS4 was twice as powerful, games would still run at 30 FPS so devs can push graphics

This is a fair point... on consoles there will always be 30fps games that make 60fps games look bad, so some publishers will want their game to look better than other games, thus forcing 30fps.

I respect any dev that targets a solid 60fps on consoles.

The only way to counter this is to not accept 30fps games on consoles anymore.
 
GAF, as much as you would love to, and as much as you say it, 30 fps is not unplayable.
But for safety, get your eyes checked.

You're welcome = )

Well, it is certainly playable. For casuals who drop the game within a week, that is. Try playing CS, Quake or any MOBA at 30fps.

I'm fine with locked 30fps in single player. Multiplayer is another world altogether. Any multiplayer that tries to be taken seriously and is 30fps is a joke and will surely fail after two weeks of casual population.
 
This is a fair point... on consoles there will always be 30fps games that make 60fps games look bad, so some publishers will want their game to look better than other games, thus forcing 30fps.

I respect any dev that targets a solid 60fps on consoles.

The only way to counter this is to not accept 30fps games on consoles anymore.

That won't work on this hardware. Some people do want good looking games, especially in single player.

On the plus side if you only buy fully 60fps games on these consoles you'll save yourself a ton of money.
 
GAF, as much as you would love to, and as much as you say it, 30 fps is not unplayable.
But for safety, get your eyes checked.

You're welcome = )

It is definitely not. I am more than happy with games that run at 30 fps, it's just that 60 fps is so much better that a slight resolution compromise is not something that I feel is worth splitting hairs over. Especially when it is 900p, which still looks great.
 
It is definitely not. I am more than happy with games that run at 30 fps, it's just that 60 fps is so much better that a slight resolution compromise is not something that I feel is worth splitting hairs over. Especially when it is 900p, which still looks great.

I agree on the compromise. These consoles aren't powerful, so you have to prioritize properly. This is a great way to do that.


That said, 900p doesn't look good. Let alone, "great".
 
I like how these threads really bring out the stupidity. "60fps isn't better than 30fps!" "You can't even see the difference!". Lol.


PC 1 is running Game A at medium settings @ 60fps
PC 2 is running Game A at medium settings @ 30fps

Nobody will say that 30fps is better in this case. Another example:

PC 3 is running Game B at low settings @ 60fps
PC 3 is running Game B at high settings @ 30fps

You cannot say that 60fps is better in this case just because it's a faster refresh rate. Some people might like the fluidity of 60fps enough to excuse the low settings of the game. Some people might take the hit in FPS in order to play the game at high settings. What people like is subjective. In a closed system, tradeoffs have to be made. Why is it so hard for you to understand this? Some people might not like the tradeoffs made for the fps bump, and other people don't mind the tradeoff in order to achieve a higher framerate.
 
PC 1 is running Game A at medium settings @ 60fps
PC 2 is running Game A at medium settings @ 30fps

Nobody will say that 30fps is better in this case. Another example:

PC 3 is running Game B at low settings @ 60fps
PC 3 is running Game B at high settings @ 30fps

You cannot say that 60fps is better in this case just because it's a faster refresh rate. Some people might like the fluidity of 60fps enough to excuse the low settings of the game. Some people might take the hit in FPS in order to play the game at high settings. What people like is subjective. In a closed system, tradeoffs have to be made. Why is it so hard for you to understand this? Some people might not like the tradeoffs made for the fps bump, and other people don't mind the tradeoff in order to achieve a higher framerate.


I always prioritize 60+ fps over graphics. I usually aim for 120 (with LightBoost), then up the graphics until it wavers.
 
Dice are good but not the best technically.

I can't believe ND have pushed the PS4 to its limits with their first game. Just look at Uncharted 1 vs Uncharted 3.

Still, I wish this game was 1080p/60fps throughout.

The Cell, the switch to the deferred rendering paradigm, programmable shaders. Those contextual clues should make it obvious why an Uncharted 1 to Uncharted 3 visual jump is probably not too likely this gen. Let alone a UC1 to UC2 one.
 
The Cell, the switch to the deferred rendering paradigm, programmable shaders. Those contextual clues should make it obvious why an Uncharted 1 to Uncharted 3 visual jump is probably not too likely this gen. Let alone a UC1 to UC2 one.
You're right about the Cell SPUs (UC1 30% -> UC2 100% usage), but UC1 had programmable shaders running on the RSX GPU.
 
Dice are good but not the best technically.

I can't believe ND have pushed the PS4 to its limits with their first game. Just look at Uncharted 1 vs Uncharted 3.

Still, I wish this game was 1080p/60fps throughout.
Because you're misinterpreting what "pushed to the limits" really means...

Just about all games push the hardware to its limits... any game with framerate issues is pushing the PS4 BEYOND its limits...

UC1 pushed the PS3 to its limits just like UC3 did, but it's about how much better ND got at writing code for the hardware between the two games...
 
PC 1 is running Game A at medium settings @ 60fps
PC 2 is running Game A at medium settings @ 30fps

Nobody will say that 30fps is better in this case. Another example:

PC 3 is running Game B at low settings @ 60fps
PC 3 is running Game B at high settings @ 30fps

You cannot say that 60fps is better in this case just because it's a faster refresh rate. Some people might like the fluidity of 60fps enough to excuse the low settings of the game. Some people might take the hit in FPS in order to play the game at high settings. What people like is subjective. In a closed system, tradeoffs have to be made. Why is it so hard for you to understand this? Some people might not like the tradeoffs made for the fps bump, and other people don't mind the tradeoff in order to achieve a higher framerate.

Class in session. READ THIS Y'ALL!
 
You're right about the Cell SPUs (UC1 30% -> UC2 100% usage), but UC1 had programmable shaders running on the RSX GPU.

But it was their first game to have programmable shaders, in an era when console devs were making the switch and learning the ropes. It makes sense that their first attempt at it would not be as refined as later ones.

This gen doesn't really have that, beyond compute shaders... but that paradigm shift is not nearly as huge.
 
I don't feel like UC MP needs 60fps, personally. It might have been better to keep it consistent with the campaign, seeing as Gears Ultimate is pretty jarring when switching from one to the other.
 
Because you're misinterpreting what "pushed to the limits" really means...

Just about all games push the hardware to its limits... any game with framerate issues is pushing the PS4 BEYOND its limits...

UC1 pushed the PS3 to its limits just like UC3 did, but it's about how much better ND got at writing code for the hardware between the two games...

You are right generally, but UC1 didn't push the PS3 to its limits, like most early PS3 games, due to the unfamiliar nature of the Cell.
 
I agree on the compromise. These consoles aren't powerful, so you have to prioritize properly. This is a great way to do that.


That said, 900p doesn't look good. Let alone, "great".

It sure does to me. But I'm also not so hung up on resolution that I rip my hair out over the difference between 1080p and 900p. It is not a big enough difference for me to give a shit. I would much rather have a game that runs and plays better at 900p, 7 days a week and twice on Sunday.
 
PC 1 is running Game A at medium settings @ 60fps
PC 2 is running Game A at medium settings @ 30fps

Nobody will say that 30fps is better in this case. Another example:

PC 3 is running Game B at low settings @ 60fps
PC 3 is running Game B at high settings @ 30fps

You cannot say that 60fps is better in this case just because it's a faster refresh rate. Some people might like the fluidity of 60fps enough to excuse the low settings of the game. Some people might take the hit in FPS in order to play the game at high settings. What people like is subjective. In a closed system, tradeoffs have to be made. Why is it so hard for you to understand this? Some people might not like the tradeoffs made for the fps bump, and other people don't mind the tradeoff in order to achieve a higher framerate.


That's it right there. Neither side is wrong either. It's just what they prefer.
 
PC 1 is running Game A at medium settings @ 60fps
PC 2 is running Game A at medium settings @ 30fps

Nobody will say that 30fps is better in this case. Another example:

PC 3 is running Game B at low settings @ 60fps
PC 3 is running Game B at high settings @ 30fps

You cannot say that 60fps is better in this case just because it's a faster refresh rate. Some people might like the fluidity of 60fps enough to excuse the low settings of the game. Some people might take the hit in FPS in order to play the game at high settings. What people like is subjective. In a closed system, tradeoffs have to be made. Why is it so hard for you to understand this? Some people might not like the tradeoffs made for the fps bump, and other people don't mind the tradeoff in order to achieve a higher framerate.

Yes I can, because framerate has a huge effect on control fluidity. Faster response times and all that jazz.
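The response-time point can be made concrete with simple frame-time arithmetic (a sketch only; real input latency also depends on the engine's pipeline, the display, and controller polling):

```python
# Per-frame time budget at each framerate: the minimum granularity
# at which the game can sample input and show you a new image.
def frame_time_ms(fps: float) -> float:
    """Duration of one frame in milliseconds."""
    return 1000.0 / fps

t30 = frame_time_ms(30)   # ~33.3 ms between frames
t60 = frame_time_ms(60)   # ~16.7 ms between frames
print(f"30fps: {t30:.1f} ms/frame, 60fps: {t60:.1f} ms/frame, "
      f"delta: {t30 - t60:.1f} ms")
# -> 30fps: 33.3 ms/frame, 60fps: 16.7 ms/frame, delta: 16.7 ms
```

Halving the frame time roughly halves the delay between your input and the screen reacting, which is why 60fps feels so much more responsive in a shooter.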
 
But it was their first game to have programmable shaders, in an era when console devs were making the switch and learning the ropes. It makes sense that their first attempt at it would not be as refined as later ones.

This gen doesn't really have that, beyond compute shaders... but that paradigm shift is not nearly as huge.
Yeah, UC2 and UC3 had better shaders.

What about async compute though? Maxing out those 8 ACEs/64 queues ain't easy, is it?

Will UC4 support asynchronous shaders?
 
Dice are good but not the best technically.

I can't believe ND have pushed the PS4 to its limits with their first game. Just look at Uncharted 1 vs Uncharted 3.

Still, I wish this game was 1080p/60fps throughout.

The Cell was very hard to program.

The standard x86 architecture isn't...

They will of course get more out of it with their next game, but don't expect a leap like UC1 to UC2/3.
 
PC 1 is running Game A at medium settings @ 60fps
PC 2 is running Game A at medium settings @ 30fps

Nobody will say that 30fps is better in this case. Another example:

PC 3 is running Game B at low settings @ 60fps
PC 3 is running Game B at high settings @ 30fps

You cannot say that 60fps is better in this case just because it's a faster refresh rate. Some people might like the fluidity of 60fps enough to excuse the low settings of the game. Some people might take the hit in FPS in order to play the game at high settings. What people like is subjective. In a closed system, tradeoffs have to be made. Why is it so hard for you to understand this? Some people might not like the tradeoffs made for the fps bump, and other people don't mind the tradeoff in order to achieve a higher framerate.

Yep, pretty much. Before I upgraded my graphics card, I made this exact tradeoff with The Witcher 3: playing it with its beautiful graphics at a locked 30 FPS rather than going for lower quality at a more consistent 60 FPS.
 
PC 1 is running Game A at medium settings @ 60fps
PC 2 is running Game A at medium settings @ 30fps

Nobody will say that 30fps is better in this case. Another example:

PC 3 is running Game B at low settings @ 60fps
PC 3 is running Game B at high settings @ 30fps

You cannot say that 60fps is better in this case just because it's a faster refresh rate. Some people might like the fluidity of 60fps enough to excuse the low settings of the game. Some people might take the hit in FPS in order to play the game at high settings. What people like is subjective. In a closed system, tradeoffs have to be made. Why is it so hard for you to understand this? Some people might not like the tradeoffs made for the fps bump, and other people don't mind the tradeoff in order to achieve a higher framerate.

Even with this truth thrown in their faces, some people will still deny the existence of personal preferences.

Well no shit, I was stating my own preference too. I think that's pretty much a given in a discussion forum?

It's far from a given, since a lot of people (myself included, a lot of the time) state subjective things in the form of "X is Y because I say so" instead of something like "I think that..." or "for me, ...".

But we also have the problem that people don't agree on what is subjective or objective.
 
Yeah, UC2 and UC3 had better shaders.

What about async compute though? Maxing out those 8 ACEs/64 queues ain't easy, is it?

Will UC4 support asynchronous shaders?

If they didn't, wouldn't that be quite surprising? I mean, it is a studio that has prided itself on its technology utilization since UC2... I doubt they are leaving all that silicon on the table unutilized.
I do not see why they wouldn't.
 
Even with this truth thrown in their faces, some people will still deny the existence of personal preferences.
I guess you won't be playing the UC4 multiplayer for being too hideous, then? It's not even going to be a high/low settings difference between it and the campaign.
 
This is brought up all the time, but we really have no idea what it is and how it functions. It would be nice if we actually knew.

So has Evolution been cagey about their tech stuff or what? It seems odd given that they made one of the best-looking games in existence. You would think they would be eager to shout about it from the rooftops.

Crytek certainly enjoy doing that.
 