
[Digital Foundry] Performance Analysis: Assassin's Creed Syndicate

I'm just saying that if this engine can barely handle 30 fps at 900p, they should rethink their graphical priorities. Even The Witcher 3 looks better, and it's even more compromised compared to the PC version.
Just to be precise, I can live with 900p, but the final result in Syndicate isn't even that impressive.
We don't know what framerate both consoles can handle without unlocked builds. I find your complaints very strange.

The Witcher also got a performance patch on PS4. Hell, AC4 was 900p on PS4, lol.
How are those not similar cases?

Hum, AK on PC can't be compared to Syndicate on consoles. One was broken at launch, the other is not. One clearly does not run well relative to the hardware; the other actually does, to me. So I'm not surprised at all that Syndicate needs to make compromises to achieve its targets on consoles. AK's performance jumped by a massive 50-60% in my case, and I'm pleased with how the game runs (60fps, mostly stable, with near-max settings).
What makes you think Syndicate leaves performance on the table?

DX11 - nothing is DX12 yet and won't be until mid-late 2016.
Fable Legends - Q4 2015 is DX12
Gears UE also.

But Syndicate will likely be DX11 only.
 
There is simply nothing all that cutting edge in either console. They both use relatively weak CPU setups, and their GPUs are pretty middle of the road...

I couldn't care less about any problems with Arkham Knight on PC because I don't game on a PC... but if there are across-the-board problems with the game running, even on bleeding-edge PC hardware, then it's a problem with the game...



What's your point? You're making apples-to-oranges comparisons when you say "most games." There are 900p games out there on the PS4, and from major AAA studios as well...

Of course I never actually said 1080p was impossible, but people would be in here complaining about the framerate if they had picked 1080p and it dropped more frames than the Xbone version...

I'm not defending Ubi, but they are in a lose-lose situation...
900p games are not that common on PS4. I don't understand why it's suddenly just a hardware fault when only a small minority of games run at 900p. I'm not saying Ubisoft is incompetent, but I don't like the preconception that it's just a hardware fault when that might not be true at all.
We don't know what framerate both consoles can handle without unlocked builds. I find your complaints very strange.
Uh. It's strange to complain about an engine that can barely handle 30 fps at 900p on console? I think it's quite normal.
 
900p games are not that common on PS4. I don't understand why it's suddenly just a hardware fault when only a small minority of games run at 900p. I'm not saying Ubisoft is incompetent, but I don't like the preconception that it's just a hardware fault when that might not be true at all.
Or it could very well be the culprit.

There is nothing surprising about 900p on PS4; all it takes is the right workload.

Did you burst a tit when Battlefront was revealed to be 900p as well?
 
You wanted a stable 30fps? Now you've got it, and the game performs worse on Xbone, so it's not "parity".

If you make a game run with exactly the same settings (parity), it doesn't mean it's not still parity when those settings cause the other console to chug... it means the other console can't handle the engine running the same settings as PS4.
 
900p games are not that common on PS4. I don't understand why it's suddenly just a hardware fault when only a small minority of games run at 900p. I'm not saying Ubisoft is incompetent, but I don't like the preconception that it's just a hardware fault when that might not be true at all.
Yeah, when you do get 900p from other developers, at least the game runs much better, with good IQ, and it looks far more appealing visually, like with Battlefront. I can't really compliment Ubi on this one.
 
If you make a game run with exactly the same settings (parity), it doesn't mean it's not still parity when those settings cause the other console to chug... it means the other console can't handle the engine running the same settings as PS4.

Which means it's not parity. One runs better than the other.
 
900p games are not that common on PS4. I don't understand why it's suddenly just a hardware fault when only a small minority of games run at 900p. I'm not saying Ubisoft is incompetent, but I don't like the preconception that it's just a hardware fault when that might not be true at all.

Uh. It's strange to complain about an engine that can barely handle 30 fps at 900p on console? I think it's subjective here.

Nobody is saying that the hardware is completely at fault, but it's the major player. You are correct that 900p games are rare on the PS4... but you can't simply point to that as evidence of why a game should or should not be 900p. You simply don't know enough about what's going on under the hood of this particular game to make that argument. I have no doubt that further optimization could have helped, maybe even helped a lot. But you can only do so much with fixed hardware.
 
And 30fps can't be demanding then? I'm sorry, but it's entirely possible you are overestimating how capable the consoles are.

Obviously, I could be wrong too.
I'm overestimating the consoles by saying this engine is needlessly expensive on such low-end hardware?
I've said multiple times that I don't like this engine for this reason, and I continue to dislike it because the results aren't impressive enough to justify its compromises.
I'm not saying there's no excuse for such performance.
 
Seems to be a recurring theme with CPU-intensive games on consoles this gen. As we all know, DR3 absolutely hammered the Xbone at launch with its touted thousand-zombies-on-one-screen gameplay.
 
I'm overestimating the consoles by saying this engine is needlessly expensive on such low-end hardware?

Rendering open-world games like this requires hardware resources, and from what I've seen Ubisoft had their priorities right; they got rid of those needlessly draining huge crowds, for instance.

I disagree with the notion that they have not used the resources at their disposal wisely.
 
Why do so many people always assume every game is GPU limited? Both systems have the same amount of RAM and a pretty similar CPU. If the game is running into memory limitations at 900p then they aren't going to magically make it run at a higher resolution on PS4, regardless of the better GPU. I swear people just dig for things to be angry about these days while pushing aside any kind of rational thought.
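The resolution argument above is just arithmetic at its core. As a rough sketch (the deferred G-buffer layout here — four 32-bit render targets plus a depth buffer — is a hypothetical example, not anything confirmed about Syndicate's engine), 1080p means pushing 44% more pixels than 900p, and the render targets grow by the same factor:

```python
# Illustrative math only: pixel counts and render-target memory at
# 900p vs 1080p. The G-buffer layout (four 32-bit targets plus a
# 32-bit depth buffer) is an assumed example setup, not Syndicate's
# actual configuration.

def pixels(width, height):
    return width * height

def gbuffer_mb(width, height, targets=4, bytes_per_pixel=4):
    # `targets` colour/normal/material buffers plus one depth buffer,
    # each at `bytes_per_pixel` bytes per pixel
    total_bytes = pixels(width, height) * bytes_per_pixel * (targets + 1)
    return total_bytes / (1024 * 1024)

p900 = pixels(1600, 900)    # 1,440,000 pixels
p1080 = pixels(1920, 1080)  # 2,073,600 pixels

print(f"1080p / 900p pixel ratio: {p1080 / p900:.2f}x")  # 1.44x
print(f"G-buffer at 900p:  {gbuffer_mb(1600, 900):.1f} MB")
print(f"G-buffer at 1080p: {gbuffer_mb(1920, 1080):.1f} MB")
```

That 1.44x applies to fill rate and every screen-sized buffer at once, which is why a GPU (or memory budget) that is comfortable at 900p can miss its frame target at 1080p.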
 
Funny how, when the marketing deal changes, so does performance.

As if the difference in performance would be worth the money.
Also Unity was patched several times and there wasn't a big difference afterwards. Let's see how this will turn out.
 
Seems to be a recurring theme with CPU-intensive games on consoles this gen. As we all know, DR3 absolutely hammered the Xbone at launch with its touted thousand-zombies-on-one-screen gameplay.
DR3 gives me the suspicion that it's really horribly developed. The zombies don't have particularly elaborate AI to justify such a compromise.
 
People are still blaming CPU spec for pushing a lower number of (not that great quality) pixels?

Riiight.
 
I'm just saying that if this engine can barely handle 30 fps at 900p, they should rethink their graphical priorities. Even The Witcher 3 looks better, and it's even more compromised compared to the PC version.
Just to be precise, I can live with 900p, but the final result in Syndicate isn't even that impressive.

The thing is that you can't judge the load on a system just by looking at the game. There is a lot of calculation done in the small time frame that 30fps gives you.
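To put a number on that "small time frame": at 30fps the engine gets roughly 33 ms per frame for everything — AI, physics, animation, and rendering (this is plain arithmetic, nothing game-specific assumed):

```python
# Frame-time budget at common console targets: everything the engine
# does each frame (AI, physics, animation, rendering) must fit in it.

def frame_budget_ms(fps):
    return 1000.0 / fps

for fps in (30, 60):
    print(f"{fps} fps -> {frame_budget_ms(fps):.1f} ms per frame")
# 30 fps -> 33.3 ms per frame
# 60 fps -> 16.7 ms per frame
```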
 
People are still blaming CPU spec for pushing a lower number of (not that great quality) pixels?

Riiight.

Well, to say it is not, one would have to have a PC and investigate how much the engine relies on CPU performance, testing different setups.
 
So you would have been fine with the PS4 dropping more frames than the Xbone because of better visuals?

I'd rather have cleaner LOD and 1080p at a solid 30 with whatever sacrifices they had to make (which would still look better than Black Flag, and Black Flag looked great... but no, they over-promised, and now everyone is expecting visuals that aren't achievable from Ubisoft).
 
Sounds fine.
I'm ok with the cutbacks if it results in a performance that's closer to 30fps. And it seems like it did.
 
The engine being terrible also doesn't help.

Meanwhile MGS V runs at 1080/60.

MGS5 is pretty much using last-gen tech; its world is also not even close to being half as complex and dense as the one in AC, nor does it have to deal with as many NPCs.
 
So what did they do to make this one perform worse than Unity on Xbone when they even scaled back the load? It's clear that it should perform worse than on PS4, but performing worse on the same console with the same constraints sounds odd.
 
It's such a step down visually from Unity. I guess that's the cost of a more stable frame rate and a pointless day/night cycle.
 
So, two years/release-cycles on and we've lost resolution and frame-rate consistency versus the first release on this generation of hardware. Whatever is being gained in return had better be pretty great.
 
Why do so many people always assume every game is GPU limited? Both systems have the same amount of RAM and a pretty similar CPU. If the game is running into memory limitations at 900p then they aren't going to magically make it run at a higher resolution on PS4, regardless of the better GPU. I swear people just dig for things to be angry about these days while pushing aside any kind of rational thought.
What's special about this game's deferred rendering setup that they'd be struggling with a 1080p30 frame buffer?

To be fair, MGSV is pretty barren as far as open worlds go and they aren't rendering as many AI at once as the AC games do.
Not particularly barren inside bases and it hardly struggles at 1080p60.
 
So, two years/release-cycles on and we've lost resolution and frame-rate consistency versus the first release on this generation of hardware. Whatever is being gained in return had better be pretty great.

Yes, because things like this never happened in previous gens, and every game this gen is doing this... right.
 
I'd rather have cleaner LOD and 1080p at a solid 30 with whatever sacrifices they had to make (which would still look better than Black Flag, and Black Flag looked great... but no, they over-promised, and now everyone is expecting visuals that aren't achievable from Ubisoft).

LOD is also GPU-dependent, so at 1080p the LOD might actually have needed to be toned down compared to what it is now. I don't think the general gamer cares about resolution, so Ubisoft did the right thing: prioritise visuals and scale over resolution.
 
Destiny says hello. I think Fallout 4 was also confirmed as 1080p/30 on both.

Also, something running at 900/30 does not mean parity; the frame rate can be wildly different.

At least if both are 1080p at the target framerate, that means both consoles hit the goals they set. I'm fine with that.

It just seems particularly odd here given that there isn't a huge delta in framerate at 900p and the AA and AF are relatively low as well. You'd think they'd at least try to clean up the IQ from Unity.
 
And 30fps can't be demanding then? I'm sorry, but it's entirely possible you are overestimating how capable the consoles are.

Obviously, I could be wrong too.
If Battlefield 4 was 1080p60fps then this discussion would be comparable.

Black Flag was 1080p30fps (and solid at that).
 
If Battlefield 4 was 1080p60fps then this discussion would be comparable.

Black Flag was 1080p30fps (and solid at that).
Black Flag was much more polished in general. Only the character faces were especially ugly, and that's more art than tech.
 