
Unofficial response from Assassin's Creed dev on 900p drama. Bombcast 10/14/2014

KidJr

Member
I bet this has been answered more than once actually. Anyway: the PS4 has about 40% more GPU power and also more memory bandwidth, with no hard size limit like the one the ESRAM imposes. 1080p has 44% more pixels than 900p. Even if performance scaled 1:1 with resolution (in reality it's usually a little less than that), the PS4 would manage 1080p at about the same frame rate. Furthermore, the GPU architectures are identical, so most optimizations done for the Xbox One version in this regard should directly translate over to the PS4 version. My opinion: 1080p should be possible on PS4 without all that much extra effort. Alternatively they could stay at the same resolution but use the extra power for higher-res shadow maps or the like. Or maybe the Xbox One doesn't run at a stable 30 fps in GPU-bound situations, but the PS4 version could.

Now, 900p at 60 fps would be an entirely different matter. You'd need twice as much of everything, including CPU power. Only possible on PC.
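To put rough numbers on the scaling argument, here's a quick back-of-the-envelope sketch in Python (the TFLOPS figures are the commonly cited specs; the 30 fps baseline is just an assumption for illustration):

[code]
# Back-of-the-envelope resolution scaling, assuming GPU cost scales
# roughly 1:1 with pixel count (in practice it's usually a bit less).
XB1_PIXELS = 1600 * 900    # 900p  = 1,440,000 pixels
PS4_PIXELS = 1920 * 1080   # 1080p = 2,073,600 pixels
pixel_ratio = PS4_PIXELS / XB1_PIXELS          # ~1.44

# Commonly cited raw GPU throughput: ~1.31 TFLOPS (XB1) vs ~1.84 TFLOPS (PS4).
gpu_ratio = 1.84 / 1.31                        # ~1.40

# If the Xbox One holds an assumed 30 fps at 900p, a purely GPU-bound
# PS4 version at 1080p would land at roughly the same frame rate:
ps4_fps = 30 * gpu_ratio / pixel_ratio
print(f"pixel ratio: {pixel_ratio:.2f}x, GPU ratio: {gpu_ratio:.2f}x")
print(f"estimated PS4 fps at 1080p: {ps4_fps:.1f}")   # ~29.3
[/code]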



We don't (though nothing hints at it so far). We'll have to wait and see.

Umm, that's not true, because the optimizations you would make for the Xbone would be to take advantage of the ESRAM, unless you're talking about optimizing on the CPU, in which case there is little to no difference in performance. You don't have to account for or utilise the speed of the ESRAM on the PS4 to make up for a lack of bandwidth; getting your data read from point A to point B isn't an issue in the slightest. Nor do you need to think about splitting your data into chunks, nor about what exactly you're going to pass through the ESRAM.

Different use cases, nothing funny about that. Crowd AI versus cloth, i.e. gameplay vs. non-gameplay code. GPGPU's great strength is its parallelism, but it comes with a great limitation: your CPU and GPU act as separate systems, and getting results back from the GPU to the CPU potentially costs a lot of performance and is really tricky to optimize.

That's why the stuff that's good to offload to GPGPU is visuals and fluff. Particles, cloth, illumination: all those are fine because you can keep them in a black box, both computing them and rendering them on the GPU, because they don't factor into gameplay. But anything the CPU needs to take an active look at is going to cost you just to get the information back. So in many of those cases you'll find you get better performance being GPU-bound instead of waiting for the GPU to deliver.
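A toy model of that readback cost (all the millisecond numbers here are hypothetical, chosen only to show the shape of the problem):

[code]
# Toy frame-time model for GPGPU offloading (hypothetical numbers).
CPU_WORK_MS = 10.0   # gameplay code the CPU runs every frame
GPU_JOB_MS  = 4.0    # a compute job offloaded to the GPU (e.g. cloth)
READBACK_MS = 2.0    # latency to sync the results back to the CPU

# Fire-and-forget (particles, cloth, lighting): the GPU consumes its own
# results, so the job overlaps with CPU work and the frame costs
# whichever side takes longer.
fire_and_forget = max(CPU_WORK_MS, GPU_JOB_MS)         # 10.0 ms

# Round-trip (e.g. crowd AI): the CPU stalls until the result comes back,
# so the GPU job plus the sync land on the frame's critical path.
round_trip = CPU_WORK_MS + GPU_JOB_MS + READBACK_MS    # 16.0 ms

print(f"fire-and-forget: {fire_and_forget} ms, round-trip: {round_trip} ms")
[/code]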

Yeah, I remember the Infamous guys talking about this, how you spend so much time/resources syncing the data... may I ask, if you're able to read from the same memory locations, why the f*ck do you need to spend so much time syncing the data? What's the point of even reading from the same location?

Someone in this thread posted that this gen's specs aren't capable of running open-world games at 1080p/60fps. I would be willing to bet all of my mortgages that you could on the PS4. I don't think people understand how big of a deal GPU compute is, or what it is and isn't good at. While I'm not saying the Ubisoft dev is wrong for saying the game is CPU bound (how anyone can disagree with his statement without seeing the game code is beyond me), I put it down to time constraints, efficiency and parity.
 
Umm, that's not true, because the optimizations you would make for the Xbone would be to take advantage of the ESRAM, unless you're talking about optimizing on the CPU, in which case there is little to no difference in performance. You don't have to account for or utilise the speed of the ESRAM on the PS4 to make up for a lack of bandwidth; getting your data read from point A to point B isn't an issue in the slightest. Nor do you need to think about splitting your data into chunks, nor about what exactly you're going to pass through the ESRAM.

Any ESRAM related optimizations would simply be unneeded on PS4, but that is hardly the only kind of GPU optimization you have to do. In any case you can't solve every problem at the same time with ESRAM. In many cases you just consume three times the bandwidth copying data that could have been read directly from DDR3, or you lose the ability to exploit the ESRAM bandwidth for other, more pressing tasks.
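Counting out the "three times the bandwidth" point (a simplified byte tally; real costs depend on access patterns and what else the copy displaces):

[code]
# Simplified traffic tally for one N-byte buffer the GPU needs to read.
N = 32 * 1024 * 1024        # e.g. a 32 MB buffer (the ESRAM's full size)

direct = N                  # Option A: shader reads it straight from DDR3
staged = N + N + N          # Option B: DDR3 read (copy out) + ESRAM write
                            # (copy in) + ESRAM read (the actual use)

print(staged / direct)      # 3.0: three memory touches instead of one
[/code]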
 

c0de

Member
Wouldn't it be more surprising if they didn't use it?

It would be surprising if they had run tests on the high-bandwidth, very-low-latency ESRAM with code optimized for the special architecture and then didn't mention it with a single word on their slides.
 

RVinP

Unconfirmed Member
In no circumstance above does going from 900p on the 260x to 1080p on the 265 result in a massive frame rate drop

On an Intel i7 system, where the CPU doesn't share the GPU<->VRAM memory bandwidth, no, there won't be a massive frame rate drop.
 

DOWN

Banned
It's a bit disappointing when Ubi and posts here mistake the controversy for being about resolution instead of parity. We want to know what happened to the extra PS4 power.
 

gngf123

Member
The difference in power is probably linked to optimisation, in that the PS4 takes less time to optimise to 900p/30fps.

Something like this:

[image: unlabelled graph sketching optimisation time to hit 900p/30fps on PS4 vs Xbox One; lower is better]

Great graph man, you could work for Polygon with something like that.
 

Ahzrei

Banned
I'd be more inclined to believe this guy if the email didn't turn into an ad for the game part way through.


Either way, the game will be what the game will be. I haven't bought any AC games since III, and I don't plan on buying Unity unless the reviews are good and there's no annoying DRM on the PC version. I find this unlikely.
 

gngf123

Member
Why are you guys giving him shit for that graph? Lower is better in that case, he's not trying to suggest the X1 is better.

I didn't really say anything about his position or mine, what he says about developers possibly hitting target framerates quicker on PS4 than on X1 is true in some cases.

A no-data, unlabelled graph is probably the worst way it could be explained, and mostly unnecessary to begin with.
 

Marlenus

Member
On an Intel i7 system, where the CPU doesn't share the GPU<->VRAM memory bandwidth, no, there won't be a massive frame rate drop.

Irrelevant.

My methodology is rather simple. Take the largest performance factor in the consoles, the GPU, find the closest PC counterparts that do not suffer from VRAM bottlenecking (so the 260x, which is > the Xbox One GPU, and the 265, which is ~= the PS4 GPU) and see what happens if you compare 900p on the 260x to 1080p on the 265. What we found is that where the 260x can maintain a playable framerate at 900p, the 265 can do the same at 1080p in all of the tests, apart from Batman, where the 265 is 2x faster at 1080p than the 260x is at 900p.

If you now want to bring memory bandwidth into the equation, we know that the 265 is not memory bandwidth limited, because the 7870 GHz Edition, which has less bandwidth (153.6GB/s on the 7870 vs 179.2GB/s on the 265), performs better than the 265 in all instances. With the PS4's 176GB/s of memory bandwidth shared between GPU and CPU, the GPU will have at minimum 156GB/s of bandwidth available even if the CPU is using all of its 20GB/s allocation. This shows that bandwidth will not be a factor for the PS4. The same cannot be said for the Xbox One, however, which could very well be more limited, leading to lower than expected performance where bandwidth is at a premium.
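Spelling out that bandwidth arithmetic (a sketch using only the figures cited above):

[code]
# PS4 unified GDDR5 budget, using the figures cited above.
TOTAL_BW   = 176.0          # GB/s, shared between CPU and GPU
CPU_BW_CAP = 20.0           # GB/s, the CPU's maximum allocation

gpu_worst_case = TOTAL_BW - CPU_BW_CAP
print(f"GPU worst-case bandwidth: {gpu_worst_case:.0f} GB/s")   # 156

# The 265 (179.2 GB/s) isn't bandwidth-limited, since the 7870 GHz
# Edition performs better on only 153.6 GB/s; the PS4's worst case
# still clears that bar.
print(gpu_worst_case >= 153.6)                                  # True
[/code]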

In conclusion, as has been stated many times, what the Xbox One can do at 900p the PS4 can do at 1080p.
 
I guess it's never occurred to either of you that people can be goofballs on their entertainment podcast, and still have their shit together professionally.
Not particularly. I like the guys, but they are never my go-to source for actual, real information on what's going on in the video game industry. Even Patrick Klepek, arguably the one with the best journalistic reputation, gets stuff wrong and sticks to it ('I heard Sony was doing DRM too, they just haven't announced it yet!').

They read anonymous dev emails all the time with little or no vetting, so I take what they have to say with a grain of salt. That said, they're very entertaining.
 

KidJr

Member
Any ESRAM related optimizations would simply be unneeded on PS4, but that is hardly the only kind of GPU optimization you have to do. In any case you can't solve every problem at the same time with ESRAM. In many cases you just consume three times the bandwidth copying data that could have been read directly from DDR3, or you lose the ability to exploit the ESRAM bandwidth for other, more pressing tasks.

Yeah, that's kind of what I was saying, because I believe he stated that the same optimizations you'd use for the Xbone engine would also be beneficial to the PS4. They don't translate over because, like you said, they're unneeded, unless I have misunderstood.
 
Secondly, I love how they talk about their pre-baked lighting like it's this amazing thing, and they have loads of data on the Blu-ray for their lighting system. Sorry, it is a pre-baked system. Seriously, with the amount of compute performance you have on tap, I expect more games to take the Drive Club route and do fully dynamic GI.

Even Drive Club doesn't have dynamic global illumination. It has dynamic environment lighting and reflections -- but not GI.
 

driver116

Member
I didn't really say anything about his position or mine, what he says about developers possibly hitting target framerates quicker on PS4 than on X1 is true in some cases.

A no-data, unlabelled graph is probably the worst way it could be explained, and mostly unnecessary to begin with.

I wasn't trying to pass it off as fact, just an opinion. The graph was a simple visualisation of what I was trying to say, as I have a hard time putting what I think into words. It was like a bully train you jumped onto.
 

KingSnake

The Birthday Skeleton
With so much drama one would say that Unity will bomb on PS4 because of the parity. Which it won't.

Looking forward to reading the comments from common posters from here and the OT.
 

Dehnus

Member
Sigh, I am so tired of this.

Ubisoft, please just gimp the XBone version to 720p so we can stop this discussion.

Then finally we can talk about things that matter again, like the Wii U and Captain Toad! :) *Huggles his Wii U*
 
With so much drama one would say that Unity will bomb on PS4 because of the parity. Which it won't.

Looking forward to reading the comments from common posters from here and the OT.
I already dissuaded three of my friends from buying it; if they tell three friends, then their friends tell three more... boom, pyramid boycott.
 

Jonm1010

Banned
With so much drama one would say that Unity will bomb on PS4 because of the parity. Which it won't.

Looking forward to reading the comments from common posters from here and the OT.

You sound almost proud that this will likely happen?

To me it's the equivalent of getting giddy at the prospect that consumers are stupid enough that the always-online backlash won't affect sales.

The only way to force a change of policy at these companies is through sales, and in some cases enough bad PR that the companies think it's not in their best interest to continue a policy, for fear of long-term repercussions. The longer this topic stays relevant, the better chance that has of happening.
 

Monarch

Banned
This thread.
The amount of disrespect for developers without having the slightest idea of how game development works is just unbelievable.
 
  • It was because of parity and stuff...no...no...no...wait
  • It was because of the CPU requirements of the AI...no...no...no...wait
  • It was because of the CPU requirements of the prebaked global illumination...Yea that's the ticket

We'll get a good idea of whether their latest explanation is true with the PC release. If the achievable resolution scales with CPU power, then Ubisoft's explanation becomes more believable. If, on the other hand, the resolution is more closely tied to GPU power, then Ubisoft will have been caught bald-faced lying to its gaming public.

However, even with this explanation, there is one thing that cannot be disputed: Ubisoft left a good portion of the PS4's GPU unused.
 

KingSnake

The Birthday Skeleton
You sound almost proud that this will likely happen?

To me it's the equivalent of getting giddy at the prospect that consumers are stupid enough that the always-online backlash won't affect sales.

The only way to force a change of policy at these companies is through sales, and in some cases enough bad PR that the companies think it's not in their best interest to continue a policy, for fear of long-term repercussions. The longer this topic stays relevant, the better chance that has of happening.

Proud? Proud of what? How can you be proud of something not related to you?
 

Handy Fake

Member
This thread.
The amount of disrespect for developers without having the slightest idea of how game development works is just unbelievable.

You could say that the amount of disrespect they've had for their audience is just as audacious and kicked the whole thing off.
 
You could say that the amount of disrespect they've had for their audience is just as audacious and kicked the whole thing off.
That's right, if it hadn't been for Vincent Pontbriand talking about parity 'to avoid debates and stuff,' this whole thing never would have been nearly the debacle it became.

By the way, when Watch Dogs was announced to be 900p on PS4, we only had a 36-page thread about it (at 100 posts per page, of course):

http://m.neogaf.com/showthread.php?t=818182&page=1

It's amazing what a difference one stupid statement can make.

Edit: forgot it was 792 on Xbone
 

RowdyReverb

Member
That's right, if it hadn't been for Vincent Pontbriand talking about parity 'to avoid debates and stuff,' this whole thing never would have been nearly the debacle it became.

By the way, when Watch Dogs was announced to be 900p on PS4, we only had a 36-page thread about it (at 100 posts per page, of course):

http://m.neogaf.com/showthread.php?t=818182&page=1

It's amazing what a difference one stupid statement can make.

Edit: forgot it was 792 on Xbone

Reminds me: I wonder if AC:U was originally going to target 792p on XB1 as well, but MS said it was 'unacceptable'.
 
I guess it's never occurred to either of you that people can be goofballs on their entertainment podcast, and still have their shit together professionally.

No one should take what they read in the email section of the podcast too seriously; it's known to be a bullshit zone.

For example, they read one email a few months ago from an “Epic employee” saying that Fortnite was canceled.
 

Skilletor

Member
This thread.
The amount of disrespect for developers without having the slightest idea of how game development works is just unbelievable.

I'm a customer. They should be trying to earn my money instead of doing things to keep it in my wallet. I couldn't care less about the development process. Means jack to me. What matters are the standards set on the platform and the fact that they are arbitrarily limiting one to avoid debates and stuff.

The publisher should have some respect for the consumers.
 

c0de

Member
Reminds me: I wonder if AC:U was originally going to target 792p on XB1 as well, but MS said it was 'unacceptable'.

This really bothers you? Why? The game runs at 900p on Xbox One. Whether someone helped or not, does it change anything? Btw, "unacceptable" is not a magical switch that gives you more computing resources in any way.
 

Krisprolls

Banned
My opinion hasn't changed. If my platform (PS4) is not used at its best (which means it's 1080p and you see a meaningful difference from the 30% weaker competitor), I won't buy the game.

So no AC Unity for me. Money goes to Dragon Age since they acknowledged they would optimize the game for PS4 (they stated resolution will definitely be superior on PS4).

When we stop buying these "forced parity" games, they will change their minds. If they don't care about us, why should we care about their games? I don't know.
 

DeweyChat

Banned
We can't tell anything, since we haven't seen the game running on console.
The only portion we saw was on Xbox One, and the game ran poorly at 900p with drops into the 20s.

If the PS4 version at 900p is locked to 30fps with a few effects at better quality (textures, shadow resolution, pop-in), the debate about resolution is worth nothing!
 

Monarch

Banned
I'm a customer. They should be trying to earn my money instead of doing things to keep it in my wallet. I couldn't care less about the development process. Means jack to me. What matters are the standards set on the platform and the fact that they are arbitrarily limiting one to avoid debates and stuff.

The publisher should have some respect for the consumers.

More often than not, PR doesn't convey well what the reality of development really is, especially in the field of programming, because, you know... that's PR.
You can say all you want, but at the end of the day you're not part of one of the numerous studios working on AC, nor do you know how their engine scales between PS4 and Xbone.
What I see here is a sort of hatred for the numerous programmers working on the game, when in reality none of you know jack shit about the product itself.
What if production decided to stop optimisations on the PS4 at 900p/30fps early on because of time? Because a 1080p resolution couldn't be achieved with a smooth framerate in time for the release? Or what if the Xbone version was lagging behind and efforts had to be made to compensate?
The thing is, you wouldn't know any of this. Don't confuse PR with the development team.
 
More often than not, PR doesn't convey well what the reality of development really is, especially in the field of programming, because, you know... that's PR.
You can say all you want, but at the end of the day you're not part of one of the numerous studios working on AC, nor do you know how their engine scales between PS4 and Xbone.
What I see here is a sort of hatred for the numerous programmers working on the game, when in reality none of you know jack shit about the product itself.
What if production decided to stop optimisations on the PS4 at 900p/30fps early on because of time? Because a 1080p resolution couldn't be achieved with a smooth framerate in time for the release? Or what if the Xbone version was lagging behind and efforts had to be made to compensate?
The thing is, you wouldn't know any of this. Don't confuse PR with the development team.

Uhh, I haven't seen any bashing of anyone in particular, definitely not in regard to programming; I don't see where that's coming from. In addition, when people use the term "developer", they mean the whole kit and caboodle: the developers, the publisher, the PR, everything. This is because, with the modern complete lack of transparency at almost every studio, there's no way to know where the influence of one ends and the other begins. If we were forced to specify exactly who was at fault for something and in what way, it would literally be impossible, because we're not psychic. So the remaining options are not talking about issues at all ("avoiding debates"), or being more general in our accusations. When something clearly fishy is going on, I would certainly prefer the latter.
 

Skilletor

Member
More often than not, PR doesn't convey well what the reality of development really is, especially in the field of programming, because, you know... that's PR.
You can say all you want, but at the end of the day you're not part of one of the numerous studios working on AC, nor do you know how their engine scales between PS4 and Xbone.
What I see here is a sort of hatred for the numerous programmers working on the game, when in reality none of you know jack shit about the product itself.
What if production decided to stop optimisations on the PS4 at 900p/30fps early on because of time? Because a 1080p resolution couldn't be achieved with a smooth framerate in time for the release? Or what if the Xbone version was lagging behind and efforts had to be made to compensate?
The thing is, you wouldn't know any of this. Don't confuse PR with the development team.

All I know is that the only games running on PS4/XB1 at the same resolution are the ones at 1080p. EVERY OTHER TITLE that runs sub-1080p on XB1 runs at a higher resolution (and, I'm pretty sure, with comparable or better performance) on PS4.

So, no, I don't buy that they can't get this game running at a higher resolution on PS4. I'm sure the programmers are working hard. I work hard to earn the money I use to buy games and everything Ubisoft is showing and saying tells me to spend my hard earned money elsewhere.
 

bumpkin

Member
...well, I'm still getting the game (on XB1) even if it's "only 900p". Everything I've seen and heard about the game is fantastic. If people want to boycott the game because of resolution, fine, let 'em. That's their prerogative. Although I don't work in game development, I do work in software development, and I can attest to how much of a bitch optimization can be. Sometimes you do the simplest thing and get huge gains, but eventually you get to a point where there is only so much you can do; the gains get smaller and smaller.

Game companies must really miss the days before the Internet, when we got trickles of info and pictures in print magazines once a month. For all of the wonderful things we have these days with the web and "social media" and all that jazz, the unfortunate side is that it gives the vocal minority a megaphone.

Actually, it's more like the oblivious neighbor's damn car alarm going off at 3 a.m.
 