
Unofficial response from Assassin's Creed dev on 900p drama. Bombcast 10/14/2014

Getting this game to 900p was a BITCH. The game is so huge in terms of rendering that it took months to get it to 720p at 30fps. The game was 9fps 9 months ago. We only achieved 900p at 30fps weeks ago. The PS4 couldn't handle 1080p 30fps for our game.


Sounds like an admission of technical incompetence.
 

Hubble

Member
All these responses, and we still don't know what's up with the resolution parity.

Yes, yes, we get it. You're CPU bound. You apparently use a lot of the CPU time to constantly load massive buffers for the GPU. You have ~25 gigs of baked GI.
None of this explains why the PS4 is not pushing more pixels. These things do not affect pixel fillrate significantly.
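
Quick napkin math on the fillrate point, since people keep conflating it with CPU load (illustrative numbers only, nothing here is from Ubisoft):

```python
# Illustrative only: resolution scales per-pixel GPU work, not CPU work.
PIXELS_900P = 1600 * 900     # 1,440,000 pixels
PIXELS_1080P = 1920 * 1080   # 2,073,600 pixels

ratio = PIXELS_1080P / PIXELS_900P
print(f"1080p shades {ratio:.2f}x the pixels of 900p")   # ~1.44x

# Hypothetical GPU cost: if shading a 900p frame takes 20 ms...
gpu_ms_900p = 20.0
gpu_ms_1080p = gpu_ms_900p * ratio                        # ~28.8 ms

# CPU-side work (AI, streaming, feeding buffers to the GPU) costs the
# same at either resolution, which is why "we're CPU bound" doesn't by
# itself explain a resolution cap.
print(f"{gpu_ms_900p:.1f} ms @900p vs {gpu_ms_1080p:.1f} ms @1080p of GPU time")
```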

It could simply be because of time constraints while aiming for multiplatforms.
 

KoopaTheCasual

Junior Member
It could simply be because of time constraints while aiming for multiplatforms.
It would be awesome if they dedicated 10 seconds to copying and pasting your sentence into an official statement. It would actually make sense, and alleviate a bit of the hysteria.


But I guess Ubi gonna Ubi.
 

BibiMaghoo

Member
Keep fighting the good fight, Ubisoft. This kind of response only serves to make me feel like I made the right call canceling Unity and buying Mordor instead. Which, by the way, is much more fun than Unity will probably ever be, considering the series' track record.

I have to ask, but do you normally pre-order games in a series you do not find especially fun?
 

ItsTheNew

I believe any game made before 1997 is "essentially cave man art."
Ubisoft, let me help you out. Go on YouTube and show the PS4 version of Assassin's Creed running at 900p. Then (I'm sure this is easy to do) set the resolution to 1080p. If what you're saying is true, it will run at a shitty framerate and everyone will be satisfied.
 
OK, let's take their word for it:

The CPUs for both consoles are the same.

The memory on the PS4 is a tad faster.

The GPU on the PS4 is moderately faster.


Are they relying on the CPU 100% of the time for the graphics? Does this mean that if the CPU is handling the resolution, the GPU cannot deliver better effects?


If both games are 900p/30fps, would the PS4 version have better graphical effects within that resolution, or are you creating parity on the GPU side?

That is the big question.
 

Mindman

Member
I believe them this time. I really honestly do. He explains that both consoles had trouble reaching 900p, but PS4 could achieve a higher framerate at that resolution than XB1. They still decided to lock it at 30 for consistency. Good call, and perfectly explained IMO.
 

mcrommert

Banned
Should have just said the CPU is bottlenecking the game... then it makes sense to have parity between Xbox One and PS4 (as they basically have the same CPU). Just say that and be done.
 

LOL. This is pretty fucking great.
 
People aren't complaining that the game is 900p/30fps. They are complaining that BOTH PS4 and Xbox One are 900p/30fps, and that Ubisoft hasn't given a legitimate reason why beyond "to avoid debate".
 

Etnos

Banned
If the Xbone can run the exact same game in 900p, well then the PS4 can run it in 1080p.

It's not as simple as that. He stated that the CPU is processing the baked lights; considering both systems are almost on par CPU-wise, it wouldn't surprise me if that was the bottleneck for both consoles (in this particular case). As someone who worked in game development for a while, this is hard to get across, but performance is not as simple as a numbers game. Lots of factors come into consideration, especially engine design.

Now, this being a multi-platform game, they probably didn't design their engine exclusively to get the most out of the PS4, and you have the right to hate them for that, I guess...
 
Welp. Here come the armchair developers.

Yuppp...
My favorite part is that there's not a single person on this forum who can identify whether any particular game is 1080 or 900 until Digital Foundry chimes in, yet it's the most important thing to talk about this gen. Anyhow, the email makes perfect sense and is exactly what I've been assuming since this whole controversy began.
 

KoopaTheCasual

Junior Member
Should have just said the CPU is bottlenecking the game... then it makes sense to have parity between Xbox One and PS4 (as they basically have the same CPU). Just say that and be done.
They've already said that. People countered that resolution is traditionally GPU limited, and having the same CPU doesn't have much to do with it. That's the current debate, anyways.
 
If the bottleneck is the CPU, then this makes a lot of sense. The component that is nearly the same in both the PS4 and the XB1 is the CPU. The extra GPU power and memory in the PS4 don't matter if they are not involved in the bottleneck. Sounds reasonable to me.

That's not how it works, though. Vertex data is fed from the CPU into the GPU. Then the GPU does goofy math for every point on the screen and works out what it hit, many times over.

Okay, let's try an analogy.

Once the GPU has the scene data to work with, it renders the game at any resolution it wants. Just imagine it like an autistic kid, sitting behind a screen door looking out towards the street.

The CPU is the puppeteer, making the street. The porch. The cars driving by. Sets all of that up.

The GPU is the kid looking into each square in the screen door and saying what color he sees when he looks in it.

If it's a tighter screen door, with smaller holes, it doesn't affect how detailed the street is, just how many holes there are, and how long it takes to do it.

The PS4 kid can count 30 holes in the time it takes the XB1 kid to count 20 holes.

So no matter what happens, the PS4 kid can count more holes. And if they're limiting it to 900 holes because of the puppeteer in the street, that makes no fucking sense, does it?
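
To put toy numbers on that analogy (every figure below is invented for illustration, not a measurement of either console):

```python
# Toy frame-time model of the screen-door analogy. All numbers invented.
def fps(cpu_ms, gpu_ms_per_mpix, megapixels):
    """Frame time is set by whichever processor finishes last."""
    frame_ms = max(cpu_ms, gpu_ms_per_mpix * megapixels)
    return 1000.0 / frame_ms

CPU_MS = 33.0                   # near-identical CPUs, close to the 33.3 ms budget
PS4_GPU, XB1_GPU = 14.0, 21.0   # ms per megapixel; PS4's GPU ~1.5x faster here

for label, mpix in (("900p", 1.44), ("1080p", 2.07)):
    print(f"{label}: PS4 {fps(CPU_MS, PS4_GPU, mpix):.1f} fps, "
          f"XB1 {fps(CPU_MS, XB1_GPU, mpix):.1f} fps")

# With these made-up numbers, both consoles sit at ~30 fps at 900p because
# the CPU is the wall. At 1080p the PS4 is still CPU-bound while the XB1's
# GPU becomes the limit: a CPU bottleneck caps framerate, not resolution.
```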
 

kmax

Member
Just listening to this week's Bombcast and they got an email from an Assassin's Creed/Ubisoft developer. The dev stated they would be willing to provide further proof of their position/job if needed, but the Bombcast guys felt comfortable enough reading it on air after cross checking the claims made with another developer.

This begins at about 2:25 on this week's Bombcast.

I'm going to give as close a quote as I can.

"I'm happy to enlighten you guys because way too much bullshit about 1080p making a difference is being thrown around. If the game is as pretty and fun as ours will be, who cares? Getting this game to 900p was a BITCH. The game is so huge in terms of rendering that it took months to get it to 720p at 30fps. The game was 9fps 9 months ago. We only achieved 900p at 30fps weeks ago. The PS4 couldn't handle 1080p 30fps for our game, whatever people, or Sony and Microsoft say. Yes, we have a deal with Microsoft, and yes we don't want people fighting over it, but with all the recent concessions from Microsoft, backing out of CPU reservations not once, but twice, you're talking about a 1 or 2 fps difference between the two consoles. So yes, locking the framerate is a conscious decision to keep people bullshiting, but that doesn't seem to have worked in the end. Even if Ubi has deals, the dev team members are proud, and want the best performance out of every console out there. What's hard is not getting the game to render at this point, it's making everything else in the game work at the same level of performance we designed from the start for the graphics. By the amount of content and NPCs in the game, from someone who witnessed optimization for lots of Ubisoft games in the past, this is crazily optimized for such a young generation of consoles. This really is about to define a next gen like no other game before. Mordor has next gen system and gameplay, but not graphics like Unity does. The proof comes in that game being cross gen. Our producer (Vincent) saying we're bound with AI by the CPU is right, but not entirely. Consider this, they started this game so early for next gen, MS and Sony wanted to push graphics first, so that's what we did. I believe 50% of the CPU is dedicated to helping the rendering by processing pre-packaged information, and in our case, much like Unreal 4, baked global illumination lighting. The result is amazing graphically, the depth of field and lighting effects are beyond anything you've seen on the market, and even may surpass Infamous and others. Because of this I think the build is a full 50gigs, filling the bluray to the edge, and nearly half of that is lighting data."

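Taking the email's numbers at face value, here's the back-of-the-envelope version (the arithmetic is mine; the 50 GB and 50% figures are the dev's claims, and 1.6 GHz 8-core Jaguar is the PS4's stock CPU):

```python
# Back-of-the-envelope on the email's claims (their numbers, my arithmetic).
BUILD_GB = 50.0
lighting_gb = BUILD_GB / 2                   # "nearly half of that is lighting data"
print(f"~{lighting_gb:.0f} GB of baked GI")  # lines up with the ~25 GB cited upthread

# "50% of the CPU ... processing pre-packaged information" at 30 fps on a
# 1.6 GHz 8-core Jaguar works out to roughly:
cores, hz, fps = 8, 1.6e9, 30
cycles_per_frame = cores * hz / fps
print(f"~{0.5 * cycles_per_frame / 1e6:.0f}M CPU cycles per frame on data prep")
```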
 
Still trying to push the PS4 and X1 as being near equal when that's far from the case. Good thing I'm not buying this game. I mean hell, even if it were true, with the amount of trouble this guy's talking about, why would I buy a game that will probably end up with a lot of issues? I've bought every AC day one; not anymore.

The PS4 is stronger; if the game doesn't show that, then it was purposely held back, and I won't be supporting that. Why should PS4 owners suffer due to MS's mistakes?
 

Kezen

Banned
The CPU part explains the framerate. Still, why is the resolution identical?
Ubisoft should have never talked about any of that.
 
I believe them this time. I really honestly do. He explains that both consoles had trouble reaching 900p, but PS4 could achieve a higher framerate at that resolution than XB1. They still decided to lock it at 30 for consistency. Good call, and perfectly explained IMO.

Don't believe liars.
 

orochi91

Member
Yuppp...
My favorite part is that there's not a single person on this forum who can identify whether any particular game is 1080 or 900 until Digital Foundry chimes in, yet it's the most important thing to talk about this gen. Anyhow, the email makes perfect sense and is exactly what I've been assuming since this whole controversy began.

Try being more creative with the damage control.
 
No, they're just sharing the realities of game development on new hardware, explaining that it was no easy task to reach the 900p/30fps end point they did.
I understand that, but they could have maybe reduced effects or the number of people in the crowd to achieve smoother performance. It may take lots of work to get to what they did, but some people prefer smoother gameplay to background effects and such.
They should have just stuck with parity and not tried to defend the standpoint.

You missed the part about how the dev takes shots at Shadow of Mordor for not being next gen because it doesn't look as good as Unity.
Exactly, karma is coming hard. I rather like Shadow of Mordor because I prefer smoothness over background noise.
 

benny_a

extra source of jiggaflops
They've already said that. People countered that resolution is traditionally GPU limited, and having the same CPU doesn't have much to do with it. That's the current debate, anyways.
Ubisoft also said that this new engine was designed for next-gen hardware.
The new hardware is the one with GPU compute that can take over quite a few CPU tasks, according to their own presentation.

One of them is lying; it's either anonymous Ubisoft or public Ubisoft. ;-)

CPU on Xbox One is faster.
Not in any real-world public tests, nor according to any verified real-world multiplatform developer.
 
Yuppp...
My favorite part is that there's not a single person on this forum who can identify whether any particular game is 1080 or 900 until Digital Foundry chimes in, yet it's the most important thing to talk about this gen. Anyhow, the email makes perfect sense and is exactly what I've been assuming since this whole controversy began.

Hi.
 

Markitron

Is currently staging a hunger strike outside Gearbox HQ while trying to hate them to death
Yuppp...
My favorite part is that there's not a single person on this forum who can identify whether any particular game is 1080 or 900 until Digital Foundry chimes in, yet it's the most important thing to talk about this gen. Anyhow, the email makes perfect sense and is exactly what I've been assuming since this whole controversy began.

Are you serious??
 
For reference, the lighting in the PS4's Infamous is miles better than anything I have seen in Unity, and that was near the launch window. Ubisoft has no excuse.
 
Yuppp...
My favorite part is that there's not a single person on this forum who can identify whether any particular game is 1080 or 900 until Digital Foundry chimes in, yet it's the most important thing to talk about this gen. Anyhow, the email makes perfect sense and is exactly what I've been assuming since this whole controversy began.

Borderline bannable.
 

eot

Banned
Someone explain this to me. What exactly is the CPU doing in this case? What is "pre-packaged information"?

Compressed lighting information. The engine needs to load it, decompress it, and send it to the GPU. Decompressing data in real time can eat up a lot of CPU time, and if you have a massive game world with partially pre-baked lighting, that will take a shitton of space. It's not like textures that you tile; it's unique data. I have no idea if 50% of the CPU time is a reasonable number, but it's possible they decided to use that prebaked GI and were stuck with weak CPUs and not much power left.
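
For anyone wanting to picture that load/decompress/upload loop, here's a minimal sketch; zlib is just a stand-in for whatever codec the engine actually uses, and the chunk layout is made up:

```python
import zlib

CHUNK = 4 * 1024 * 1024  # invented: stream compressed GI in 4 MB chunks

def stream_lighting(path):
    """Yield decompressed lighting blocks; each decompress() burns CPU time."""
    decomp = zlib.decompressobj()
    with open(path, "rb") as f:
        while chunk := f.read(CHUNK):
            yield decomp.decompress(chunk)   # the CPU-side cost described above

# Each yielded block would then be uploaded to GPU memory by the graphics
# API. Keep that loop running over ~25 GB of unique (non-tiled) GI data as
# the player moves through the world and the CPU time adds up.
```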
 

JayEH

Junior Member
You missed the part about how the dev takes shots at Shadow of Mordor for not being next gen because it doesn't look as good as Unity.
 