
Digital Foundry: Face-Off: Assassin's Creed Unity

Are you really pulling the "20 fps is fine for me so it's not an issue" card?

Well, it's the PS4 version that is most often at 20fps, not the Xbox One version, but that isn't saying much. The Xbox One version seems to be consistently below 30fps; it usually hovers around 25fps outside when there are crowds around, with occasional dips below that. However, Digital Foundry also reports that the lowest framerates recorded were on the Xbox One version.
 
I'm 16 hours into the Xbox One version and the framerate is not an issue. At all. So yeah, it can be overstated, and you just did.

The only situations I found where the framerate was really bad were some viewpoint synchronizations (where it isn't a problem, since you aren't controlling the character anyway). Other than that it is perfectly fine and has never caused any control issues for me.

That being said, those videos in the Digital Foundry article show how on PS4 the framerate can certainly be an issue. I assume you are playing on PS4?



Congratulations on getting a version that has no framerate issues at all. People in the OT are having issues with both versions. How did you get one where the "framerate is not an issue"?
 
People expecting Ubisoft to release a magic patch that will bring this game up 10+fps are kidding themselves... Ubisoft isn't planning a long-term post-release support schedule, just like they didn't for Watch Dogs, Black Flag, or any of their other games.

At best we'll see a minor performance patch that fixes some of the stutters, but if you believe Ubi is going to put in the extra 3-5 months of dev time they should have put into it, after they've already gotten all the money they are going to get, you're just kidding yourself.
 
People expecting Ubisoft to release a magic patch that will bring this game up 10+fps are kidding themselves... Ubisoft isn't planning a long-term post-release support schedule, just like they didn't for Watch Dogs, Black Flag, or any of their other games.

At best we'll see a minor performance patch that fixes some of the stutters, but if you believe Ubi is going to put in the extra 3-5 months of dev time they should have put into it, after they've already gotten all the money they are going to get, you're just kidding yourself.
They are in a much worse position right now; this doesn't compare to Watch Dogs.
 
Yeah. Something I mentioned in the previous thread.

The statement he made about locking the game at the same specs "to avoid debates and stuff" was indeed referring to complete parity. Some people argued that it might not mean complete parity, just that the resolution and framerate were the same. It turns out complete parity is indeed what he meant.

Of course we now know it wasn't simply to avoid debates and stuff; it was largely to do with the game being in an unfinished state. I postulated that he probably said "to avoid debates and stuff" to deflect attention away from the issue. Even the talk about the CPU bottleneck, while true, is only half the story. The game is also a buggy mess, and it seems there are times when the framerate issues aren't entirely understood or explained away by the CPU bottleneck.

Obviously he was never going to tell the truth at the time: that the game was a fucking mess and that they were planning to release it in that state (lol). This also explains the sneaky language used in Ubisoft's official PR statement.

Yes it's clear that Ubi PR was doing preemptive damage control by pretending they actually had some sense of control in their own development process, when in fact they were told to leave the washroom and finish taking their shit outside.
 
I think the thing that troubles me most about this is the PR beforehand. It isn't even the horrible parity statements. It's things like:

This is notable because the team has dedicated much of the past few months to optimizing Unity to reach 900p with a consistent 30 frames per second.

Assassin's Creed: Unity has been engineered from the ground up for next-generation consoles. Over the past four years, we have created Assassin's Creed: Unity to attain the tremendous level of quality we have now achieved on Xbox One, PS4 and PC. It's a process of building up toward our goals, not scaling down, and we're proud to say that we have reached those goals on all SKUs.

Now, I know it is press speak, but this has really impacted my feelings towards the quality of their products. I'm fortunate enough to have been able to choose between three different platforms to play this game on, and I chose none of them based on everything that has happened over the past month.
 
People expecting Ubisoft to release a magic patch that will bring this game up 10+fps are kidding themselves... Ubisoft isn't planning a long-term post-release support schedule, just like they didn't for Watch Dogs, Black Flag, or any of their other games.

At best we'll see a minor performance patch that fixes some of the stutters, but if you believe Ubi is going to put in the extra 3-5 months of dev time they should have put into it, after they've already gotten all the money they are going to get, you're just kidding yourself.

I spent the first few days thinking I'd wait to play the story missions until they fixed the framerate, but now that they've set up their live blog for upcoming patches, and they're already on patch three with no concrete plans to even address the framerate... Yeah, I don't think they're going to do it. At this point I'm just powering through the story so I can trade it in while it still has some value.
 
I still can't wrap my head around why they would be so stubborn about dropping crowd sizes in order to preserve performance.

I agree about TXAA. It's even worse than FXAA. But I love MSAA. MSAA is much sharper, and I'm more than willing to accept some more shimmering edges as a trade-off.

The only downside of MSAA is that it can't tackle shader aliasing. SSAA could, but no one is using it.

I think the best trade-off for consoles is SMAA T2x. It isn't as blurry as FXAA and does a better job against the jaggies.
I definitely agree that SMAA T2x is the sweet spot for consoles. Clear, good coverage, and not too taxing. I really hope we see it replace FXAA before long. I wouldn't have as much of an issue with FXAA if it actually cleaned up edges well, but it often completely fails on top of blurring the image. If someone could eliminate the ghosting SMAA T2x can sometimes cause, and add an element that addresses shader aliasing, it would be by far the best post-process solution.
 
I still can't wrap my head around why they would be so stubborn about dropping crowd sizes in order to preserve performance.


I definitely agree that SMAA T2x is the sweet spot for consoles. Clear, good coverage, and not too taxing. I really hope we see it replace FXAA before long. I wouldn't have as much of an issue with FXAA if it actually cleaned up edges well, but it often completely fails on top of blurring the image.
The game drops frames with 10 people on the screen. I don't think the crowd count is the big problem here.
 
We paired a Core i3 4130 with a GTX 750 Ti, set PC presets to console equivalents (though we swapped in HBAO+ for a bit of a quality boost) and ran at 1600x900 (900p). In essence we pitted console-level PC tech against PS4 and Xbox One and found that the game ran fairly well. We locked frame-rate to 30fps using Riva Tuner Statistics Server and, cut-scene stutters aside, frame-rates held up in 26-30fps territory.

Great that they confirmed it with benchmarks. It really is a CPU-limited game; Jaguar is trash, it seems.
Still, that doesn't justify the developer not lowering the settings to get it to a playable state.
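For anyone curious what a cap like that does under the hood: conceptually it just holds every frame to a fixed time budget so output never runs faster than the target. A minimal sketch of a 30fps limiter loop (an illustration of the idea, not how RTSS actually implements it):

#include <chrono>
#include <thread>

int main() {
    using clock = std::chrono::steady_clock;
    const auto frame_budget = std::chrono::microseconds(33333); // ~1/30 s

    auto deadline = clock::now() + frame_budget;
    for (int frame = 0; frame < 1000; ++frame) {
        // ... simulate and render the frame here ...

        // Sleep off whatever remains of the 33.3ms budget. If the frame
        // already overran it, sleep_until returns immediately; a cap can't
        // create performance, which is why the test still dipped to 26fps
        // in heavy scenes.
        std::this_thread::sleep_until(deadline);
        deadline += frame_budget;
    }
}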
 
I still can't wrap my head around why they would be so stubborn about dropping crowd sizes in order to preserve performance.

The huge crowds cause reliable, repeatable drops into the low 20s for as long as you care to spend amongst them, but they aren't the only thing that kills the framerate. Certain interior levels will chug along even with no NPCs in them. You can be sauntering through a random mansion in slideshow territory the whole time. The game's an absolute mess.
 
We had BF4, then DriveClub came! It was (and is) so bad that you can't justify the product. And just when you think the garbage is over, Ubisoft tops it all with Unity.

What is wrong with the industry?
 
Nah. Not lazy.

Harsh deadlines combined with an ambitious project = buggy project.
The PS4's performance woes can be best attributed to an allocation of resources not favoring the PS4.

That and Ubisoft being fucking morons with regard to "parity."

There's absolutely no technical reason why the PS4 version should run at the same resolution with the same graphical effects as the Xbox One version. None.
 
That and Ubisoft being fucking morons with regard to "parity."

There's absolutely no technical reason why the PS4 version should run at the same resolution with the same graphical effects as the Xbox One version. None.

But there is... watch the cutscene framerates. They barely keep 30fps at 900p; at 1080p it would be a mess like on the Xbone.

But hey, let's pretend that it's parity, and not that this game's settings are too high for these consoles at the current optimization stage of their new engine.
 
(reaction gif)

hahaha, exactly how ubi should feel.
 
But there is... watch the cutscene framerates. They barely keep 30fps at 900p; at 1080p it would be a mess like on the Xbone.
OK, but 10% more CPU shouldn't give an advantage of 5-6fps in the worst-case scenarios. That sounds absolutely crazy to me. It's terrible optimization on PS4 hardware, lack of time, I don't know what else, but seriously, I can't believe for a second that you can get around 5fps or more out of only 10% more CPU speed (it's even less, to be honest).
 
OK, but 10% more CPU shouldn't give an advantage of 5-6fps in the worst-case scenarios. That sounds absolutely crazy to me. It's terrible optimization on PS4 hardware, lack of time, I don't know what else, but seriously, I can't believe for a second that you can get around 5fps or more out of only 10% more CPU speed (it's even less, to be honest).

Maybe if some calculations are not done in time, they add additional stalls to the pipeline that decrease performance even more than they should.
 
I had no intention of touching this with a barge pole, and this just makes for sad reading.

I feel sorry for the guys and girls who worked on this for months and months and then had to ship something so unoptimised.

Simply unacceptable. I'm sure a lot of people will steer clear, but I bet it still sells well enough that Ubisoft think they've got away with it.
 
For what reason? Just trying to understand.

For example, you calculate A, and after it finishes you start calculating C.
In another thread you calculate B, and after it, D.
If C depends on B and B is late, then C can't start right after A, and you get an additional stall.

And because those stalls multiply across a whole frame's time, they could increase the computation time by more than just the theoretical difference between the two CPUs.
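A minimal sketch of that stall, with hypothetical tasks and made-up timings (nothing to do with Unity's actual job system):

#include <chrono>
#include <future>
#include <thread>

// Per-frame tasks mirroring the A/B/C example above.
int computeA() { std::this_thread::sleep_for(std::chrono::milliseconds(4)); return 1; }
int computeB() { std::this_thread::sleep_for(std::chrono::milliseconds(9)); return 2; }
int computeC(int a, int b) { return a + b; } // C needs both A and B

int main() {
    auto a = std::async(std::launch::async, computeA);
    auto b = std::async(std::launch::async, computeB);

    int av = a.get();               // A is ready after ~4ms
    int cv = computeC(av, b.get()); // but C stalls here until B lands at ~9ms
    return cv == 3 ? 0 : 1;
}

A is done at the 4ms mark, yet the thread that wants to run C sits idle until B arrives; chain a few dependencies like that through a frame and the idle gaps add up.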
 
Digital Foundry said:
On PlayStation 4, we ran into problems where sudden frame latency spikes would occur for extended periods, causing gameplay to stutter in a stop-start fashion - the issue here is that we are seeing 60-100ms pauses manifest during the action, easily noticeable when playing, and noted on our frame-time graphs

Frame-time variance appears to be erratic on the PS4. Unplayable, as far as I'm concerned.
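For what it's worth, spotting those pauses in a capture is straightforward; a minimal sketch with hypothetical frame times (not DF's data or tooling):

#include <cstdio>
#include <vector>

int main() {
    // Hypothetical captured frame times in ms; a clean 30fps capture
    // would sit near 33.3 throughout.
    std::vector<double> frame_ms = {33.4, 33.3, 71.9, 33.5, 98.2, 33.4};

    // Flag anything in the 60-100ms band DF describes as a visible pause.
    int i = 0;
    for (double ms : frame_ms) {
        if (ms >= 60.0)
            std::printf("spike at frame %d: %.1f ms\n", i, ms);
        ++i;
    }
}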

But there is... watch the cutscene framerates. They barely keep 30fps at 900p; at 1080p it would be a mess like on the Xbone.

But why not sacrifice cutscene performance for a higher resolution during gameplay? That is, assuming the CPU is the bottleneck in most gameplay scenarios, going by the framerate advantage on the Xbox One.
 
Are you really pulling the "20 fps is fine for me so it's not an issue" card?
Soon these guys will be playing at 10fps and saying the framerate is fine and the game is great.

By the looks of it, that 6fps drop is unique to the Xbone, since they say it's a problem the PS4 did not have.
 
How is the gameplay good? I'm genuinely curious. What makes it better than other AC games? The reviews I've read say it's more of the same with fewer features, like no boats.
It's basically a bigger "Ezio Trilogy". Better mission variety and more ways to complete missions, slightly more difficult combat, better parkour (ways to move both up and down, where in previous games it was all about moving up). Bigger environments; both outdoor and indoor spaces can be used to play missions.

In my case it's also better precisely because of the "no boats". That's what put me off III and IV, and I was ready to ignore Unity, but I'm glad I didn't.
 
I can't see any excuse for this. They budgeted resources for their game badly and now the consumer must pay the price. It's that simple.
 
Apparently the PS4 version holds up better in co-op, but it has a stutter.

The Xbone version falls as low as:

1920x-1

The game is just an unfinished mess. They should have made Rogue cross-gen and pushed this to fall 2015. At least with MS's money they spent a bit more time optimizing it, so Xbone fans have that Pyrrhic victory.
 
so the epitome of rushed game dev.

either way, we shouldn't care how rushed and time-constrained they were.

you make a project, you fucking finish it.

never heard of people who can't finish their project and get sympathy and support for it.

in the company that i work in, you meet your deadlines and you better be damn sure you finish your project, mid size or big.

imagine a project and the finished product is a building with weak foundation or cracked walls.

cpu bottleneck? xbone cpu is 100mhz faster. if you oc your pc cpu from 3ghz to 3.1ghz it's not gonna result in a 5fps performance boost

shitty optimisation all around, in every system.

what an abomination.

actually, it should have scored lower.
 
Anyone who knew about these issues (which have been discussed for a good while) and still bought the game really upsets me, because it validates releasing shitty unfinished games. We reap what we sow; stuff like this should not be supported.
 
They probably spent so much time making the X1 version semi-playable that the other versions were completely neglected.
 
But there is... watch the cutscene framerates. They barely keep 30fps at 900p; at 1080p it would be a mess like on the Xbone.

But hey, let's pretend that it's parity, and not that this game's settings are too high for these consoles at the current optimization stage of their new engine.

I don't think you really understand how hardware differences impact performance. But okay.

The game is obviously CPU-bound, which is what is causing the framerate issues, but resolution has practically no impact on CPU performance. There are other ways in which they could have improved visuals on PS4 without impacting CPU performance as well. There's no scenario other than developer incompetence or "forced parity" (aka the publisher making a dumbfuck decision) where the equivalent of a Radeon 7770 perfectly keeps up with the equivalent of a Radeon 7850. Even Digital Foundry is baffled by it. Stop lumping both consoles together as if they have the same hardware. The PS4 GPU has major advantages that translate into better visuals with minimal effort.
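One way to see the resolution point: to a first approximation a frame costs whichever of the CPU and GPU pipelines is slower, and resolution mostly moves the GPU term. A toy model with made-up numbers, not profiler data:

#include <algorithm>
#include <cstdio>

int main() {
    // Hypothetical costs for a CPU-bound frame. The CPU side (AI, crowds,
    // game logic) is resolution-independent; the GPU side scales roughly
    // with pixel count.
    const double cpu_ms      = 40.0; // same at any resolution
    const double gpu_900_ms  = 22.0; // made-up GPU cost at 1600x900
    const double gpu_1080_ms = gpu_900_ms * (1920.0 * 1080.0) / (1600.0 * 900.0);

    std::printf("900p:  %.1f ms/frame\n", std::max(cpu_ms, gpu_900_ms));  // 40.0
    std::printf("1080p: %.1f ms/frame\n", std::max(cpu_ms, gpu_1080_ms)); // 40.0
    // Identical frame times: while the CPU is the wall, the extra
    // resolution would have been effectively free on the stronger GPU.
}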
 
For example, you calculate A, and after it finishes you start calculating C.
In another thread you calculate B, and after it, D.
If C depends on B and B is late, then C can't start right after A, and you get an additional stall.

And because those stalls multiply across a whole frame's time, they could increase the computation time by more than just the theoretical difference between the two CPUs.

Frankly, your theorycrafting looks to be almost up there with "But it's not just a 9% CPU clock advantage! It's 9% -per core-."

(The 9% isn't even a certainty in real-world scenarios, I might add, because:
- The PS4 CPU clock was never confirmed; it could very well be 1.75GHz as well. We just do not know.
- The XB1 runs three OSes and the Snap feature, which eat more CPU time than the PS4's OS with no Snap feature.) And "no debates and stuff" too, of course.

What we have here is a case of platform preference (for lead dev and optimization) mainly.

And if we're talking paranoia factor, I could add a potential anchor placed around the PS4's neck to explain a surprising (up to) 25% fps deficiency, but that would be crazy talk. Right?
 
We had BF4, then DriveClub came! It was (and is) so bad that you can't justify the product. And just when you think the garbage is over, Ubisoft tops it all with Unity.

What is wrong with the industry?
As an ex-coder I've got to say it's unfair to lump Driveclub in with those games. Driveclub passed beta and stress tests and was released in good faith, with no idea there was a bug that wouldn't surface until it was live.

EA, and in particular Ubisoft with Unity, knowingly shoved seriously flawed products out, Ubisoft going so far as to use PR and review embargoes to hide the mess until after the first wave of people had bought it.

Something like Driveclub will always happen from time to time, and they've been patching it and offering free DLC.

Ubisoft so far have done squat, apart from implying they may refrain from abusing review embargoes.

Unity and Driveclub are very different cases in context.
 
For example, you calculate A, and after it finishes you start calculating C.
In another thread you calculate B, and after it, D.
If C depends on B and B is late, then C can't start right after A, and you get an additional stall.

And because those stalls multiply across a whole frame's time, they could increase the computation time by more than just the theoretical difference between the two CPUs.

LOL! Your example doesn't make any sense. The difference will always be 9%.

I don't think you really understand how hardware differences impact performance. But okay.

The game is obviously CPU-bound, which is what is causing the framerate issues, but resolution has practically no impact on CPU performance. There are other ways in which they could have improved visuals on PS4 without impacting CPU performance as well. There's no scenario other than developer incompetence or "forced parity" (aka the publisher making a dumbfuck decision) where the equivalent of a Radeon 7770 perfectly keeps up with the equivalent of a Radeon 7850. Even Digital Foundry is baffled by it. Stop lumping both consoles together as if they have the same hardware. The PS4 GPU has major advantages that translate into better visuals with minimal effort.

Bu...b...but cut scenes will have framerate issues?
jk
 
I don't think you really understand how hardware differences impact performance. But okay.

The game is obviously CPU-bound, which is what is causing the framerate issues, but resolution has practically no impact on CPU performance. There are other ways in which they could have improved visuals on PS4 without impacting CPU performance as well. There's no scenario other than developer incompetence or "forced parity" (aka the publisher making a dumbfuck decision) where the equivalent of a Radeon 7770 perfectly keeps up with the equivalent of a Radeon 7850. Even Digital Foundry is baffled by it. Stop lumping both consoles together as if they have the same hardware. The PS4 GPU has major advantages that translate into better visuals with minimal effort.
Totally agree.

This game is in shambles. PS4 version running as it does is a disgrace. If you plan on buying this game, please buy used.

I understand that some people enjoy their yearly dose of this stale/rehashed/soulless franchise, but please don't give these people your money. They should be held accountable for releasing an unfinished product. The only way to make them understand that this bullshit is not acceptable is to not support them with your money.
 
Frankly, your theorycrafting looks to be almost up there with "But it's not just a 9% CPU clock advantage! It's 9% -per core-."

(The 9% isn't even a certainty in real-world scenarios, I might add, because:
- The PS4 CPU clock was never confirmed; it could very well be 1.75GHz as well. We just do not know.
- The XB1 runs three OSes and the Snap feature, which eat more CPU time than the PS4's OS with no Snap feature.) And "no debates and stuff" too, of course.

What we have here is a case of platform preference (for lead dev and optimization) mainly.

And if we're talking paranoia factor, I could add a potential anchor placed around the PS4's neck to explain a surprising (up to) 25% fps deficiency, but that would be crazy talk. Right?
You do realize that both the PS4 and Xbox One have two cores dedicated to the OS, right? How much the OS actually uses of those two cores is irrelevant; the remaining six cores are unaffected. So saying Snap is eating CPU time doesn't matter, since the game can't access those two cores anyway. KKRT00's example is sound.
 
So sad to see the gaming community willingly spending money on shit like this.
No wonder they ship half-finished, unoptimised games these days; people will buy them anyway.

Shame on you!

Vote with your wallets, people. Do not support anti-consumer companies.
 
You do realize that both the PS4 and Xbox One have two cores dedicated to the OS, right? How much the OS actually uses of those two cores is irrelevant; the remaining six cores are unaffected. So saying Snap is eating CPU time doesn't matter, since the game can't access those two cores anyway. KKRT00's example is sound.

It's not sound at all; you should read up on project management (PERT charts), because things don't work the way he said. If you increase the time of each task by 10%, the time of the whole process will only increase by 10%.
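That's easy to sanity-check on a toy task graph: scale every task's duration by the same factor and the critical path, and with it the whole frame, scales by exactly that factor. A minimal sketch with a hypothetical dependency chain (assuming task durations are the only constraint):

#include <algorithm>
#include <cstdio>

int main() {
    // Hypothetical graph: A and B run in parallel, C needs both, D follows C.
    const double A = 4, B = 9, C = 2, D = 3; // ms, made up for illustration

    auto makespan = [&](double scale) {
        double c_start = std::max(A * scale, B * scale); // C waits on the later of A/B
        return c_start + C * scale + D * scale;
    };

    std::printf("baseline:      %.1f ms\n", makespan(1.0)); // 14.0
    std::printf("+10%% per task: %.1f ms\n", makespan(1.1)); // 15.4, exactly 10% more
}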
 