
Digital Foundry: Face-Off: Assassin's Creed Unity

The truly sad thing is that if you had asked me a year ago, I would have said the frame rate issues that plagued Assassin's Creed games would be gone with the new platforms, and yet here we are. At least they got BF right, I guess.

Ubisoft son, I am disappoint.

I had the same experience. The older games ran like ass on the PS3 and I really figured that they'd have it sorted out for the new machines. They, quite obviously, didn't.
 
I wonder when and how much they'll manage to improve performance. Going from 20ish fps to locked 30fps seems like a crazy person's pipe dream right now.
 
I think I have an explanation for the increased blurriness people are experiencing with the game... anyone in the know can jump in anytime and correct me if I'm wrong.

I was messing with settings last night in the game and noticed some really interesting things.

Firstly, I have a 4K TV, so 4K is its native resolution.

Playing the game at 1080p is a blurry mess for me, since I believe the TV is upscaling it to fit the 4K screen. So not only are the pixels being smoothed by FXAA, but the TV is also upscaling the image.

Switching to 1440p gives a dramatic difference in clarity, as the TV doesn't have to upscale as much. However, going to full 4K resolution, I am seeing the sharpest clarity in the game, with a perfect 1:1 ratio of rendered pixel to TV raster pixel.

I'm assuming most people have 1080p HDTVs and are seeing not only an AA'd framebuffer, but also their TVs upscaling the 900p image to 1080p, causing the blurriness.

Just my 2 cents
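The scale-factor arithmetic behind this is simple enough to sketch (illustrative Python with hypothetical helper names; nothing here is from the game or any TV firmware):

```python
# A non-integer scale factor forces the scaler to interpolate between
# source pixels, which is where the softness comes from; an integer
# ratio can in principle be done with clean pixel duplication.

def scale_factor(src_h, dst_h):
    """Vertical scale factor from render resolution to panel resolution."""
    return dst_h / src_h

def is_integer_scale(src_h, dst_h):
    """Integer ratios map each source pixel to a whole block of panel pixels."""
    return dst_h % src_h == 0

print(scale_factor(900, 1080))       # 1.2  -> interpolation required
print(scale_factor(1080, 2160))      # 2.0  -> clean 2x2 pixel blocks possible
print(is_integer_scale(900, 1080))   # False
print(is_integer_scale(1080, 2160))  # True
```

Which matches the observation above: 900p on a 1080p panel is a 1.2x stretch, while 1080p on a 4K panel is an exact 2x per axis.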
 
I'm assuming most people have 1080p HDTVs and are seeing not only an AA'd framebuffer, but also their TVs upscaling the 900p image to 1080p, causing the blurriness.

You must have successfully avoided all resolutiongate threads over the last year. ;-)
 
I think I have an explanation for the increased blurriness people are experiencing with the game... anyone in the know can jump in anytime and correct me if I'm wrong.

I was messing with settings last night in the game and noticed some really interesting things.

Firstly, I have a 4K TV, so 4K is its native resolution.

Playing the game at 1080p is a blurry mess for me, since I believe the TV is upscaling it to fit the 4K screen. So not only are the pixels being smoothed by FXAA, but the TV is also upscaling the image.

Switching to 1440p gives a dramatic difference in clarity, as the TV doesn't have to upscale as much. However, going to full 4K resolution, I am seeing the sharpest clarity in the game, with a perfect 1:1 ratio of rendered pixel to TV raster pixel.

I'm assuming most people have 1080p HDTVs and are seeing not only an AA'd framebuffer, but also their TVs upscaling the 900p image to 1080p, causing the blurriness.

Just my 2 cents

Pretty sure PS4 and XB1 don't actually have 900p video output modes.
 
How is shit like this acceptable in a final product?

[image: efnnl05.png]
Ubisoft can accomplish great feats.
For some reason I thought we already had a DF article on this. They didn't mention the crashes; interesting that the PS4 seems to outperform the Xbox One version in certain cases.
 
Pretty sure PS4 and XB1 don't actually have 900p video output modes.

It should be irrelevant whether the GPU's upscaler enlarges the image or your AVR's or TV's does. Only a clever in-game software scaler that has more information available than the final framebuffer could make a noticeable difference.
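A minimal sketch of that point, assuming a plain linear resample (hypothetical code, not any actual scaler implementation): any device that only sees the final framebuffer is interpolating the same data, so the result shouldn't depend much on where the resample runs.

```python
# Toy 1-D bilinear resample. A GPU, AVR, or TV scaler working from the
# same finished framebuffer performs some flavour of this interpolation,
# so the output is essentially device-independent.

def resample_1d(row, dst_len):
    """Linearly interpolate a row of pixel values to a new length."""
    src_len = len(row)
    out = []
    for i in range(dst_len):
        # Map the destination pixel back into source coordinates.
        x = i * (src_len - 1) / (dst_len - 1)
        lo = int(x)
        hi = min(lo + 1, src_len - 1)
        t = x - lo
        out.append(row[lo] * (1 - t) + row[hi] * t)
    return out

# The same input row produces the same output wherever this runs.
print(resample_1d([0, 100, 0], 5))  # [0.0, 50.0, 100.0, 50.0, 0.0]
```

An in-game scaler, by contrast, could draw on depth, motion vectors, or pre-tonemap data that no downstream box ever sees.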
 
How is shit like this acceptable in a final product?

[image: efnnl05.png]

That's some Skyrim PS3 shit right there.

Only it's that bad on both platforms.



Jesus Ubisoft. And after Black Flag as well, which was one of the most technically stable and artistically beautiful games ever made.
 
Non-native resolutions have been an issue for years; I had a 1080p TV for a large part of last gen.

I'm sure that's the main reason why lots of people are keying in on the benefits of 1080p this gen. Console hardware is finally at the point where native resolution is within the realm of possibility.
 
I think I have an explanation for the increased blurriness people are experiencing with the game... anyone in the know can jump in anytime and correct me if I'm wrong.

I was messing with settings last night in the game and noticed some really interesting things.

Firstly, I have a 4K TV, so 4K is its native resolution.

Playing the game at 1080p is a blurry mess for me, since I believe the TV is upscaling it to fit the 4K screen. So not only are the pixels being smoothed by FXAA, but the TV is also upscaling the image.

Switching to 1440p gives a dramatic difference in clarity, as the TV doesn't have to upscale as much. However, going to full 4K resolution, I am seeing the sharpest clarity in the game, with a perfect 1:1 ratio of rendered pixel to TV raster pixel.

I'm assuming most people have 1080p HDTVs and are seeing not only an AA'd framebuffer, but also their TVs upscaling the 900p image to 1080p, causing the blurriness.

Just my 2 cents

Thanks for the explanation I guess?

Also, I thought 1080p scales well to 4K displays, since it's exactly 4x the pixels (2x per axis).
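It can, assuming the display actually does integer scaling rather than running a filter anyway: with an exact 2x ratio per axis, each source pixel maps to a clean 2x2 block with no interpolation. A toy sketch (hypothetical helper, for illustration only):

```python
# Integer upscaling: duplicate each pixel into a factor x factor block.
# No blending between neighbours, so edges stay as sharp as the source.

def integer_upscale(img, factor):
    """Nearest-neighbour upscale by an exact integer factor."""
    out = []
    for row in img:
        wide = [px for px in row for _ in range(factor)]  # widen the row
        out.extend([wide] * factor)                       # repeat it vertically
    return out

frame = [[1, 2],
         [3, 4]]
for row in integer_upscale(frame, 2):
    print(row)
# [1, 1, 2, 2]
# [1, 1, 2, 2]
# [3, 3, 4, 4]
# [3, 3, 4, 4]
```

Whether a given 4K TV takes this path for 1080p input, rather than applying its usual interpolating scaler, varies by set.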
 
After shaking off the "parity to avoid debates" salt, I was willing to pick this up day one, but with sub-30 frames per second and various other bugs, Ubisoft can forget about it.
 
I wonder if this means that the next AC game will go back to last-gen scale? That would be sincerely disappointing. This game is the first time I've thought that Sony and MS really shit the bed by including such a useless CPU just to save money.
 
Overall, it's very difficult to avoid the sense that Assassin's Creed Unity was released in an unfinished state. On PC in particular, it really feels like beta code - feature-complete, but lacking in optimisation, with bugs manifesting regularly. On console, the game is more stable, but clearly performance is unacceptable - the frequent dips to 25fps on Xbox One are jarring enough, but it remains truly remarkable that the PS4 game should drop just as often to 20fps

[image: 26a420cfbaf5460f_image.png.xxxlarge.jpg]
 
So the parity talk was real after all: no differences whatsoever between Xbox One and PS4. They're identical graphics-wise.

Pretty disappointing.
 
So the parity talk was real after all: no differences whatsoever between Xbox One and PS4. They're identical graphics-wise.

Pretty disappointing.

And what did you expect? Parity is a real issue, we must kill it right now, and tbh I totally expected this from Ubisoft.
 
So the parity talk was real after all: no differences whatsoever between Xbox One and PS4. They're identical graphics-wise.

Pretty disappointing.

Parity was a lie that went in the opposite direction: Xbox performs better than PS4. They obviously spent more time optimizing for Xbox, because there's no logical explanation for why the PS4 would perform worse. Even DF states that the minor CPU advantage does not explain away the major FPS difference.
 
One just has to wonder what kind of state Watch Dogs was in to convince Ubisoft to delay it a good six months, if they've allowed this to be printed on discs.
 
And what did you expect? Parity is a real issue, we must kill it right now, and tbh I totally expected this from Ubisoft.

Well, I thought they were deliberately downplaying graphical differences across SKUs, as is so common; I'm genuinely surprised to see the PS4 and Xbox One versions completely identical.
 
Parity was a lie that went in the opposite direction. Xbox performs better than PS4. They obviously spent more time optimizing Xbox because there's no logical explanation why the PS4 would perform worse. Even DF states that the minor CPU advantage does not explain away the major FPS difference.

The game is not finished and is unoptimized. If the game had been given another six months to a year to cook, it probably would look like most other face-offs we've seen before.
 
Some people are reporting that going offline fixes that. I was expecting to see a mention of this in the DF article, but they don't bring it up.

They mentioned it in a previous article. Not worth mentioning again because their tests proved otherwise.
 
I think I have an explanation for the increased blurriness people are experiencing with the game... anyone in the know can jump in anytime and correct me if I'm wrong.

I was messing with settings last night in the game and noticed some really interesting things.

Firstly, I have a 4K TV, so 4K is its native resolution.

Playing the game at 1080p is a blurry mess for me, since I believe the TV is upscaling it to fit the 4K screen. So not only are the pixels being smoothed by FXAA, but the TV is also upscaling the image.

Switching to 1440p gives a dramatic difference in clarity, as the TV doesn't have to upscale as much. However, going to full 4K resolution, I am seeing the sharpest clarity in the game, with a perfect 1:1 ratio of rendered pixel to TV raster pixel.

I'm assuming most people have 1080p HDTVs and are seeing not only an AA'd framebuffer, but also their TVs upscaling the 900p image to 1080p, causing the blurriness.

Just my 2 cents

The John Madden of resolutiongate
 
So the parity talk was real after all: no differences whatsoever between Xbox One and PS4. They're identical graphics-wise.

Pretty disappointing.

Yeah. Something I mentioned in the previous thread.

The statement he made about locking the game at the same spec to avoid debates and stuff was indeed referring to complete parity. Some people argued that it might not mean complete parity, just that the resolution and framerate were the same. It turns out that is indeed what he meant.

Of course, we now know it wasn't simply to avoid debates and stuff; it was largely to do with the game being in an unfinished state. I postulated that he probably said "to avoid debates and stuff" to deflect attention away from the issue. Even the talk about the CPU bottleneck, while true, is only half the story. The game is also a buggy mess, and it seems there are times when the framerate issues aren't entirely understood or explained away by the CPU bottleneck.

Obviously he was never going to tell the truth at the time: that the game was a fucking mess and that they were planning to release it in that state (lol). This also explains the sneaky language used in Ubisoft's official PR statement.
 
It's a shame Eurogamer's style guide prevents Digital Foundry from using phrases like "absolute, complete and utter fucking dogshit" and "fucking Jesus Christ in Heaven Goddammit all to fucking shit this is terrible", because the article doesn't truly convey the real experience of playing this game. The shittiness of the framerate really can't be overstated.

You literally cannot adjust the camera without making the framerate drop. The only time I've ever seen it hit 30FPS is when I stand in a corner and stare at a wall. There are so few frames a second that it feels like you can actually press a button between frames and have the game fail to register your input. Frame drops in cutscenes. 20-second load times for deaths and 30 seconds for 'Fast' Travel. When you go into a large crowd, the game doesn't drop to 20FPS, because that would imply that it goes back up before long. This game simply runs at 20FPS when you're in a crowd. That's the framerate. 20FPS. Point the camera away again and it might climb to a dizzying 25ish, but do anything in a crowd and it's just 20, all day long. Or lower.

I gave up. Deleted yesterday. Hopefully they patch it, but with GTA V, DA:I and Destiny expansion, I'm not sure if I care.
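For what it's worth, the frame-time arithmetic behind that "press a button between frames" feeling is straightforward (illustrative numbers, not measurements from the game):

```python
# Each frame at a given framerate occupies 1000 / fps milliseconds,
# so at 20fps the game only samples the world every 50ms.
for fps in (20, 25, 30):
    print(fps, "fps ->", round(1000 / fps, 1), "ms per frame")
# 20 fps -> 50.0 ms per frame
# 25 fps -> 40.0 ms per frame
# 30 fps -> 33.3 ms per frame
```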
 
The truly sad thing is that if you had asked me a year ago, I would have said the frame rate issues that plagued Assassin's Creed games would be gone with the new platforms, and yet here we are. At least they got BF right, I guess.

Ubisoft son, I am disappoint.

I had the same experience. The older games ran like ass on the PS3 and I really figured that they'd have it sorted out for the new machines. They, quite obviously, didn't.

Hah, how naive can you people be? Of course this generation is going to be like the previous one: increase graphical quality up to the max, even at the cost of resolution and framerate. The only difference is that before, resolution fell short of the standard of the time, 720p; now it's falling short of 1080p.
 
It's a shame Eurogamer's style guide prevents Digital Foundry from using phrases like "absolute, complete and utter fucking dogshit" and "fucking Jesus Christ in Heaven Goddammit all to fucking shit this is terrible", because the article doesn't truly convey the real experience of playing this game. The shittiness of the framerate really can't be overstated.

I'm 16 hours into the Xbox One version and the framerate is not an issue. At all. So yeah, it can be overstated; you just did.

The only situations I found where the framerate was really bad were some viewpoint synchronizations (where it's not a problem, since you aren't controlling the character anyway). Other than that it is perfectly fine and has never caused any control issues for me.

That being said, those videos in the Digital Foundry article show how on the PS4 the framerate can certainly be an issue. I assume you are playing on PS4?
 
In less strenuous areas, frame-rates often hit a 30fps lock on the Xbox One, although the same experience doesn't translate over to the PS4 all the time, which still drops frames more often. Since we aren't dealing with scenes crammed with NPCs in these instances, it's hard to see what exactly is causing the PS4 to be hit so hard - the difference cannot be explained by the 10 per cent boost in CPU clock-speed on the Xbox One, perhaps suggesting that the underlying netcode issues are more to blame here.

Just say it how it is: Lazy devs.
 
So the parity talk was real after all: no differences whatsoever between Xbox One and PS4. They're identical graphics-wise.

Pretty disappointing.

They very clearly said this game would be 900p, 30FPS on both consoles. It's 900p, to be sure, but it sure as hell isn't 30FPS.

That being said, those videos in the Digital Foundry article show how on the PS4 the framerate can certainly be an issue. I assume you are playing on PS4?

I am, and this is one of the worst-performing games I've ever played. I don't feel like I'm overstating things at all. A game that averages closer to 20FPS than 30 is not 'perfectly fine' in any way whatsoever.
 
I like the game that's underneath all of the BS, but I'm going to put ACU on the backburner in hopes that they resolve some of the technical issues. So many other games to play right now anyway.
 
How is shit like this acceptable in a final product?

[image: efnnl05.png]

It gets bad. I was hanging from a building and it became a slideshow for no reason. Saw it happen four other times as well. Quality is vacant from this title. This will be the last Ubi game I buy on a whim. I had no issues with Black Flag, which clouded my judgment.
 
Still looks better than 1080p with either of the hardware AA methods, as the DF comparison video shows. It's unreal how bad MSAA/TXAA look.

I agree on TXAA. It's even worse than FXAA. But I love MSAA. MSAA is much sharper, and I'm more than willing to accept some more shimmering edges as a trade-off.

The only downside of MSAA is it can't tackle shader aliasing. SSAA could, but no one is using it.

I think the best trade-off for consoles is SMAA T2x. It isn't as blurry as FXAA and does a good/better job against the jaggies.
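For anyone curious, the SSAA idea mentioned above reduces to rendering at a higher resolution and box-filtering back down. Unlike MSAA, every sample runs the full shader, which is why it also tames shader aliasing, and why it's too expensive for consoles. A toy resolve sketch (hypothetical code, not the game's pipeline):

```python
# 2x SSAA resolve: average each 2x2 block of the supersampled image
# down to one output pixel. Every sample came from a full shader
# evaluation, so specular/shader aliasing gets filtered too.

def downsample_2x(img):
    """Box-filter a 2x-supersampled image back to target resolution."""
    out = []
    for y in range(0, len(img), 2):
        row = []
        for x in range(0, len(img[y]), 2):
            block = (img[y][x] + img[y][x + 1] +
                     img[y + 1][x] + img[y + 1][x + 1])
            row.append(block / 4)
        out.append(row)
    return out

# A hard edge in the 2x render becomes a smoothed edge after the resolve.
super_sampled = [[0, 0, 255, 255],
                 [0, 0, 255, 255],
                 [0, 255, 255, 255],
                 [0, 255, 255, 255]]
print(downsample_2x(super_sampled))  # [[0.0, 255.0], [127.5, 255.0]]
```

The cost is the catch: 2x per axis means shading 4x the pixels, which is why console games reach for post-process options like FXAA or SMAA instead.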
 
Meh - performance gripes aside, I've been enjoying the hell out of this on my PS4. Not saying that Ubi should be excused for releasing an under-baked game, but some of the reactions from people are hilarious.
 
I'm 16 hours into the Xbox One version and the framerate is not an issue. At all. So yeah, it can be overstated, you just did.

The only situations I found where the framerate was really bad was in some viewpoint synchronizations (where it is not a problem since you are not controlling the character anyway). Other than that it is perfectly fine and has never caused any control issues for me.

That being said, those videos in the Digital Foundry article show how in the PS4 the framerate can certainly be an issue. I assume you are playing on PS4?

Are you really pulling the "20 fps is fine for me so it's not an issue" card?
 
I agree on TXAA. It's even worse than FXAA. But I love MSAA. MSAA is much sharper, and I'm more than willing to accept some more shimmering edges as a trade-off.

The only downside of MSAA is it can't tackle shader aliasing. SSAA could, but no one is using it.

I think the best trade-off for consoles is SMAA T2x. It isn't as blurry as FXAA and does a good/better job against the jaggies.

Just want to clarify that I'm referring to the implementations in this game specifically, not in general.
 