
Unofficial response from Assassin's Creed dev on 900p drama. Bombcast 10/14/2014

This thread.
The amount of disrespect for developers without having the slightest idea of how game development works is just unbelievable.

I have no idea how development works, hence I won't insult Ubi for making the only two 900p/30fps games on PS4, but they're not getting my money, that's for sure.
 

QaaQer

Member
I'd be more inclined to believe this guy if the email didn't turn into an ad for the game part way through.


Either way, the game will be what the game will be. I haven't bought any AC games since III, and I don't plan on buying Unity unless the reviews are good and there's no annoying DRM on the PC version. I find this unlikely.

It might still be true even if it was authorized and written by Ubi PR. I do think the Bombcast got played by Ubisoft marketing by reading that advertisement.
 

Flo_Evans

Member


Here, now it's really worthy of Polygon

ROFL.
 

RoboPlato

I'd be in the dick
The difference in power is probably linked to optimisation, in which PS4 takes less time to optimise to 900p/30fps.

Something like this:


I know I'm a little late but this post has made me laugh harder than anything else I've seen in a tech thread in a long time. Holy crap
 

KiraXD

Member
The problem is parity.

Ubisoft are fucking hacks if they want us to swallow a gimped version of the game on PS4 just because they want it to run the same as XB1. If you make a game for consoles... make it the best fucking game on each console you can.


Don't they want the best possible game they can achieve? They are effectively saying that a subpar game is okay as long as it's subpar on all consoles equally.

It's mindblowing that a developer is okay putting out a game with substandard features to avoid fanboy rage...

(effectively causing even MORE rage among gamers... Ubisoft needs to take a step back and re-evaluate their strategy)
 

RoboPlato

I'd be in the dick
This thread.
The amount of disrespect for developers without having the slightest idea of how game development works is just unbelievable.

This is a random email from an anonymous source that doesn't give much of an explanation that solves the question we've been asking. We don't even know if it's legitimate. Considering Ubi's constant disrespect of the audience in replying to this issue, I think skepticism is warranted. If an actual member of the dev team came out and gave a detailed tech explanation without going on a rant about how shitty their audience is and taking a dig at other games, then more respect would be warranted.
 

RowdyReverb

Member
This really bothers you? Why? The game runs 900p on Xbox One. If someone helped or not, does it change anything? Btw, "unacceptable" is not a magical switch that gives you more computing resources in any way.

No, it doesn't bother me. My point is that the PS4 version could have been targeting 900p all along like Watch_Dogs, but the XB1 version managed to squeeze out 900p as well, possibly with help from MS.
 

JordanN

Banned
You are aware how much power this actually takes? There's a reason we are seeing a lot of baked lighting and will continue to. The PS4 is not a powerhouse. A 1.8 TF GPU isn't much when Unreal 4 originally required 2.5 at the bare minimum to enable its global illumination system. Add to that the fact that the PS4 GPU is driven by a CPU that is a joke compared to anything on the PC side of the fence. What people seem to have trouble accepting is that these are both fairly weak consoles for a 1080p frame buffer, especially if you are trying to pull off dynamic lighting. DriveClub is the exception and does it because it's a linear racer. And apparently it sacrifices texture detail (no AF) and IQ (lots of aliasing) among other things. If the PS4 can't run this game at 1080p with baked lighting, you can bet there's no chance in hell they could ever pull it off with a GI system...

Global illumination as the expected norm isn't coming this generation. That's what the next generation of consoles is going to be about, among other things. I get that we are used to seeing dramatic performance increases between generations, but the industry has changed. No one is selling a console at a loss anymore for years in hope of recovering their money 3-4 years later. It's just not good business.

That GI is going to be replaced soon. It was far too power hungry when Epic first announced it. GI is possible on PS4 depending on your approach.
 
I literally don't know what you're talking about. He was willing to provide proof to Giantbomb that he was legit. They're not some hardboiled news source but they still have some standards so they're not just reading industry gossip every week.

Also why didn't it leak anywhere else before? Because things don't leak until they leak. Where do you think they should leak first?

I'm not really picking sides here. I'm just saying that to me it seemed fishy because that's not how GiantBomb "developer" emails usually go. The fact that he is willing to provide proof might be convincing to most people, but for me it's weird because I don't remember a single "developer" email on GiantBomb that said this. And as far as I'm concerned it is a weird thing to say unless they act on it...

In fact, they should have followed up and turned it into a news story. It would have gotten them clicks and would be more convincing to me. (Basically, they should have acted as journalists :))
 
The problem is parity.

Ubisoft are fucking hacks if they want us to swallow a gimped version of the game on PS4 just because they want it to run the same as XB1. If you make a game for consoles... make it the best fucking game on each console you can.


Don't they want the best possible game they can achieve? They are effectively saying that a subpar game is okay as long as it's subpar on all consoles equally.

It's mindblowing that a developer is okay putting out a game with substandard features to avoid fanboy rage...

(effectively causing even MORE rage among gamers... Ubisoft needs to take a step back and re-evaluate their strategy)
Yes -- ultimately AC: Unity isn't competing with itself, it's competing with rival products in the marketplace.

If the game could have been better on PS4, then it will suffer in comparison to other PS4 games and thus in terms of sales.

Creating artificial disparity may alter the ratio of sales between PS4 and X1 somewhat, but that does Ubisoft no particular good, especially if the total number of copies sold decreases as a result.
 

QaaQer

Member
Yes -- ultimately AC: Unity isn't competing with itself, it's competing with rival products in the marketplace.

If the game could have been better on PS4, then it will suffer in comparison to other PS4 games and thus in terms of sales.

Creating artificial disparity may alter the ratio of sales between PS4 and X1 somewhat, but that does Ubisoft no particular good, especially if the total number of copies sold decreases as a result.

I guess it depends what the deal with MS is worth. MS does throw piles of money around (Minecraft / Tomb Raider).
 
I guess that pretty damn explains everything.

Except it's from an anonymous source with no credibility or foundation of evidence. It seems like anyone can just make up anything to stir up the beehive that is NeoGAF.

Maybe we should wait until credible evidence comes out before frothing at the mouth and yelling promises to not purchase Assassin's Creed Unity?
 

Marlenus

Member
If the PS4 version at 900p is locked to 30fps with a few effects at better quality (textures, shadow resolution, pop-in), the debate about resolution is worth nothing!

Three things with this.

Firstly, Ubi said that it was CPU limited. Some of those effects require more CPU resources to increase, so if what Ubi said is true (I do not believe them) then they should not be able to increase these effects.

Secondly, following from the first point, if they are genuinely CPU limited then increasing resolution can improve the IQ without impacting the CPU, provided the GPU can handle the extra workload. Based on PC benchmarks I am almost certain the PS4 GPU could handle an increase from 900p to 1080p.

Thirdly, as I do not believe that they cannot achieve 1080p on PS4, using the extra power to increase effects instead of resolution is suboptimal. 900p to 1080p is already quite hard to notice in some instances, but trying to notice the difference between High and Very High shadows, or High and Very High textures, is even harder. Increasing the resolution gives you a bigger increase in graphical fidelity than increasing effects will.

There has been one reason put forward that could explain it, although I am not convinced by it. A user here suggested that perhaps they have run out of memory. Increasing resolution does increase memory requirements on the GPU, so it is a technically valid scenario. In Watch Dogs, going from 1080p to 4K results in a 1GB increase in VRAM use (50% more); obviously 900p to 1080p is nothing like that, so I would expect a less than 200MB increase in memory usage from the higher resolution. I really doubt they are pushing memory use that much, but it is possible.
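The memory math in that last paragraph can be sanity-checked with back-of-the-envelope arithmetic. A minimal sketch in Python, assuming 32-bit render targets, one depth/stencil buffer, and a deferred G-buffer of roughly four targets (my assumptions for illustration, not figures from Ubisoft):

```python
# Rough framebuffer-memory estimate for the 900p -> 1080p jump.
# Assumed: 4 bytes/pixel per target, 4 colour targets + 1 depth/stencil.

def framebuffer_mb(width, height, render_targets=4, bytes_per_pixel=4):
    """MiB needed for the colour targets plus a depth/stencil buffer."""
    pixels = width * height
    total_bytes = pixels * bytes_per_pixel * (render_targets + 1)
    return total_bytes / (1024 * 1024)

mb_900p = framebuffer_mb(1600, 900)    # ~27.5 MiB
mb_1080p = framebuffer_mb(1920, 1080)  # ~39.6 MiB
print(f"900p -> 1080p adds ~{mb_1080p - mb_900p:.1f} MiB of render targets")
```

Even under these generous assumptions the render-target delta is a small fraction of the sub-200MB ceiling estimated above; most of any real increase would come from resolution-dependent intermediate buffers rather than the framebuffer itself.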
 

Skilletor

Member
Except it's from an anonymous source with no credibility or foundation of evidence. It seems like anyone can just make up anything to stir up the beehive that is NeoGAF.

Maybe we should wait until credible evidence comes out before frothing at the mouth and yelling promises to not purchase Assassin's Creed Unity?

My evidence is every other multiplat game release on the systems.
 
My evidence is every other multiplat game release on the systems.

That's a cop out. I won't believe anything until there are credible sources. It's so easy to create theories and drum up conspiracies based on anonymous sources. Sure, most of them end up being true in the end, but it's still not valid to jump the gun.

I'll wait until this is proven true before making comments.
 
You're in for a lot of last-gen games then... These consoles simply don't have the GPU power to pull off dynamic lighting in an open world game without technical expertise close to wizardry. Just as an example, remember the Unreal 4 engine had to be scaled back as neither console met its requirements for global illumination tech.

This generation of consoles just doesn't have the technological jump consoles used to have between generations.

I didn't say anything about dynamic lighting. I just want a day-night cycle that advances while free-roaming the world, like a lot of open-world games from last gen had: Far Cry, Assassin's Creed, Dragon's Dogma, GTA, and Red Dead Redemption, to name a few.
I don't see why current-gen consoles couldn't manage the same effects that consoles from 2005-2006 could handle.
 

Skilletor

Member
That's a cop out. I won't believe anything until there are credible sources. It's so easy to create theories and drum up conspiracies based on anonymous sources. Sure, most of them end up being true in the end, but it's still not valid to jump the gun.

I'll wait until this is proven true before making comments.

Believe what you want. For me, there's no more credible evidence than everything that's been released. That includes other games by the same developer.
 

danwarb

Member
How does that explain XB1 version being at 900p as well? Shouldn't it be lower?

Isn't resolution GPU related anyways?

Not if the difference is a few frames and the game is locked at/minimum 30fps.


They mention CPU concessions from MS. The X1 CPU is clocked a bit higher so maybe AssCreed is using more GPU compute on PS4, who knows? The conspiracy theories seem pathetic.
 

Nokterian

Member
"I'm happy to enlighten you guys because way too much bullshit about 1080p making a difference is being thrown around. If the game is as pretty and fun as ours will be, who cares? Getting this game to 900p was a BITCH. The game is so huge in terms of rendering that it took months to get it to 720p at 30fps. The game was 9fps 9 months ago. We only achieved 900p at 30fps weeks ago. The PS4 couldn't handle 1080p 30fps for our game, whatever people, or Sony and Microsoft, say.

Yes, we have a deal with Microsoft, and yes, we don't want people fighting over it, but with all the recent concessions from Microsoft, backing out of CPU reservations not once, but twice, you're talking about a 1 or 2 fps difference between the two consoles. So yes, locking the framerate is a conscious decision to keep people from bullshitting, but that doesn't seem to have worked in the end. Even if Ubi has deals, the dev team members are proud, and want the best performance out of every console out there.

What's hard is not getting the game to render at this point, it's making everything else in the game work at the same level of performance we designed from the start for the graphics. By the amount of content and NPCs in the game, from someone who witnessed optimization for lots of Ubisoft games in the past, this is crazily optimized for such a young generation of consoles. This really is about to define next gen like no other game before. Mordor has next-gen systems and gameplay, but not graphics like Unity does. The proof comes in that game being cross-gen.

Our producer (Vincent) saying we're bound with AI by the CPU is right, but not entirely. Consider this: they started this game so early for next gen, MS and Sony wanted to push graphics first, so that's what we did. I believe 50% of the CPU is dedicated to helping the rendering by processing pre-packaged information, and in our case, much like Unreal 4, baked global illumination lighting.

The result is amazing graphically; the depth of field and lighting effects are beyond anything you've seen on the market, and may even surpass Infamous and others. Because of this I think the build is a full 50 gigs, filling the Blu-ray to the edge, and nearly half of that is lighting data."

OK, let me convey with these gifs what I feel at the moment reading this.

[gif: Joaquin Phoenix]


[gif: Charlton Heston laughing, Planet of the Apes]


Keep it classy, Ubisoft. My god... really unbelievable that this shitstorm keeps growing each week! Jim Sterling needs to hold himself back, but I doubt that will happen :p
 

omonimo

Banned
Not if the difference is a few frames and the game is locked at/minimum 30fps.


They mention CPU concessions from MS. The X1 CPU is clocked a bit higher so maybe AssCreed is using more GPU compute on PS4, who knows? The conspiracy theories seem pathetic.
What you said makes no sense. They can't handle CPU tasks on the GPU.
 

Portugeezer

Member
Looks like this debate continues to go around in circles. I feel like I am reading the same shit from every other AC Parity thread.
 

Mastperf

Member
Not if the difference is a few frames and the game is locked at/minimum 30fps.


They mention CPU concessions from MS. The X1 CPU is clocked a bit higher so maybe AssCreed is using more GPU compute on PS4, who knows? The conspiracy theories seem pathetic.

I think Matt confirmed the other day that the PS4 CPU is still faster in real-world use even after MS apparently reduced the overhead. I might have misunderstood him, though.
 

vrln

Neo Member
I didn't say anything about dynamic lighting. I just want a day-night cycle that advances while free-roaming the world, like a lot of open-world games from last gen had: Far Cry, Assassin's Creed, Dragon's Dogma, GTA, and Red Dead Redemption, to name a few.
I don't see why current-gen consoles couldn't manage the same effects that consoles from 2005-2006 could handle.

Ah, ok. My bad. I thought you meant a day-night cycle done in the optimal way for maximum lighting realism, i.e. via global illumination like in DriveClub. That unfortunately will be a rare thing this generation.

My guess is that due to the higher-quality assets in contemporary console games, an old-fashioned day/night cycle would look too crummy for marketing. Graphics sell, and you get better-looking trailers using baked lighting that is specific to a certain time of day.
 
Except it's from an anonymous source with no credibility or foundation of evidence. It seems like anyone can just make up anything to stir up the beehive that is NeoGAF.

Maybe we should wait until credible evidence comes out before frothing at the mouth and yelling promises to not purchase Assassin's Creed Unity?

There's no simple explanation for this game that explains the 900p parity without adding in Microsoft.

"Our game is literally the only 900p PS4 game on the market that runs at the same resolution as the Xbox One game, and that's because.....?"

Every single response, official and unofficial, has completely eschewed the actual issue at hand, handling the vast GPU power difference. Why obscure the response?
 
The problem is parity.

Ubisoft are fucking hacks if they want us to swallow a gimped version of the game on PS4 just because they want it to run the same as XB1. If you make a game for consoles... make it the best fucking game on each console you can.


Dont they want the best possible game they can achieve? They are effectively saying that a subpar game is okay as long as its subpar on all consoles equally.

Its mindblowing that a developer is okay putting out a game with substandard features to avoid fanboy rage...

(effectively causing even MORE rage among gamers... Ubisoft needs to take a step back and re-evaluate their strategy)

Though developers have been releasing games for years that did not make the most of the hardware. Hell, CD Projekt Red said that, besides the resolution difference of 1080p on PS4 and 900p on X1, The Witcher 3 will be nearly identical on each, even without using some of the technology the PS4 has to achieve this.

Let's go back to the sixth console generation, where the Xbox and GameCube were more powerful than the PS2; many early multiplatform games didn't really take full advantage of the Xbox's power. Developers take whatever steps most easily deliver the same experience across all platforms. There will be many more games that won't use the full potential of a certain system; it is not an event isolated to Ubi or AC: Unity.

Besides, I am 100% banking that, with all this rage, Ubi will release a 1080p patch for the PS4 like the one Black Flag got.
 

omonimo

Banned
Though developers have been releasing games for years that did not make the most of the hardware. Hell, CD Projekt Red said that, besides the resolution difference of 1080p on PS4 and 900p on X1, The Witcher 3 will be nearly identical on each, even without using some of the technology the PS4 has to achieve this.

Let's go back to the sixth console generation, where the Xbox and GameCube were more powerful than the PS2; many early multiplatform games didn't really take full advantage of the Xbox's power. Developers take whatever steps most easily deliver the same experience across all platforms. There will be many more games that won't use the full potential of a certain system; it is not an event isolated to Ubi or AC: Unity.

Besides, I am 100% banking that, with all this rage, Ubi will release a 1080p patch for the PS4 like the one Black Flag got.
I thought The Witcher 3 could be 900p on PS4 as well. Have I missed some update?
 
That GI is going to be replaced soon. It was far too power hungry when Epic first announced it. GI is possible on PS4 depending on your approach.

The only game that does it right currently is Alien: Isolation. And you can see that game doesn't have many characters, outdoor areas, or even a lot of action, and it still can't push past 30fps on PS4. I don't know of any other, more optimal approaches that devs can use for an open-world game like AC: Unity.
 
Ah, ok. My bad. I thought you meant a day-night cycle done in the optimal way for maximum lighting realism, i.e. via global illumination like in DriveClub. That unfortunately will be a rare thing this generation.

My guess is that due to the higher-quality assets in contemporary console games, an old-fashioned day/night cycle would look too crummy for marketing. Graphics sell, and you get better-looking trailers using baked lighting that is specific to a certain time of day.

Would you guys stop saying that! Drive Club does NOT have global illumination!!
 

Mr Moose

Member
Would you guys stop saying that! Drive Club does NOT have global illumination!!

*Googles DriveClub Global illumination* Lots of websites say it does.

“It’s allowed us to create some of the biggest tracks you will ever see in a racing game, with the most visually-dense environments, as well as a full global illumination system which allows us to make time of day, where shadows are cast, full-on reflections. It’s allowed us to push all those various areas.”

http://www.vg247.com/2014/04/30/dri...lutely-the-best-thing-for-game-says-director/
 
I cannot wrap my mind around how they can say that 900p is all the PS4 is able to do, and then say that the Xbox One can do the same. This literally doesn't make sense in my head. The GPUs are very, very different from one another.
 

Head.spawn

Junior Member
Results are from TPU http://www.techpowerup.com/reviews/S..._Dual-X/6.html/

Game              260X @ 900p (fps)   265 @ 1080p (fps)
ACIV                      25.7                27.7
Batman: AO                32.2                66.5
Battlefield 3             53.4                52.6
Battlefield 4             35.7                35.2
BS: Infinite              68.0                61.1
COD: G                    64.3                60.8
COJ: Gunslinger          128.3               127.8
Crysis                    43.3                41.8
Crysis 3                  20.8                20.8
Diablo 3                  88.2               104.9
Far Cry 3                 26.7                25.4
Metro: LL                 38.3                35.2
Splinter Cell             27.1                29.6
Tomb Raider               27.0                23.6
WoW                       66.9                74.9

In no circumstance above does going from 900p on the 260X to 1080p on the 265 result in a massive frame rate drop. Since this is a PC-based benchmark we are comparing just the GPU differences, and there is nothing else that will skew the results. The 260X is a bit above the Xbox One GPU in terms of performance, so the results above are closer than they would be with a closer match to the Xbox One GPU. The 265 is practically the same as the PS4 GPU.

I also need to be clear here and state that these results are not any kind of expected performance target for the consoles because PCs are different to the consoles. I am just saying that these are the relative GPU differences in a real world scenario.

In the console sphere the PS4 has less overhead than the Xbox One in terms of its OS and API, leading to slightly increased CPU performance despite the slightly slower clock speed. When you add that to the above information, it is pretty conclusive proof that what the Xbox One can do at 900p the PS4 can do at 1080p.

It also highlights a scenario (Batman) where you get a much larger gap than expected; perhaps whatever causes this gap in Batman is the same as what causes the gap in Fox Engine-based games. Looking at the specs of the GPUs used here, I would put that gap down to either bandwidth (in which case perhaps the Xbox One can get a speed bump in Fox Engine games with better ESRAM usage) or the limited number of ROPs.

Would these numbers be the same if both PCs were paired with the same laughably underpowered CPUs, and the games were at near 100% CPU utilization?
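The pixel-count arithmetic underlying the quoted comparison can be made explicit. A quick sketch (the game subset and formatting are mine; fps figures are copied from the TechPowerUp table above):

```python
# 1080p pushes 44% more pixels than 900p, yet in most rows above the
# 265 at 1080p still matches or beats the 260X running at 900p.

PIXELS_900P = 1600 * 900      # 1,440,000
PIXELS_1080P = 1920 * 1080    # 2,073,600

# (game, 260X @ 900p fps, 265 @ 1080p fps)
results = [
    ("ACIV", 25.7, 27.7),
    ("Batman: AO", 32.2, 66.5),
    ("Battlefield 4", 35.7, 35.2),
    ("Crysis 3", 20.8, 20.8),
    ("Tomb Raider", 27.0, 23.6),
]

print(f"1080p/900p pixel ratio: {PIXELS_1080P / PIXELS_900P:.2f}x")
for game, fps_260x, fps_265 in results:
    # ratio >= 1.0 means the 265 renders 44% more pixels and is no slower
    print(f"{game:15s} fps ratio: {fps_265 / fps_260x:.2f}")
```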
 

JordanN

Banned
The only game that does it right currently is Alien:Isolation. And you can see that this game doesn't have many characters or outdoors or even a lot of action and it still can't push over 30+FPS on PS4. I don't know of any other more optimal approaches that devs can use for an open-world game like AC:Unity.
Dead Island 2 is open world and it currently uses GI (light propagation volumes mixed with distance-field ambient occlusion). Realtime GI still has a long way to go, so I'm not saying it's going to be perfect on PS4.

But there's been a lot of progress being made in it. You can find new whitepapers that were published after Epic showed SVOGI, so there's definitely hope for it. Even Epic themselves have updated their roadmap to include real time GI again.
 
Dead Island 2 is open world and it currently uses GI (light propagation volumes mixed with distance-field ambient occlusion). Realtime GI still has a long way to go, so I'm not saying it's going to be perfect on PS4.

But there's been a lot of progress being made in it. You can find new whitepapers that were published after Epic showed SVOGI, so there's definitely hope for it. Even Epic themselves have updated their roadmap to include real time GI again.

Light propagation on static scenes and geometry isn't really realtime GI in my book. The light probes are prebuilt when the scene is loaded; Crysis 3 does this. It's not a solution you can see in practice. As I said, Alien is the only game that shows it happening in realtime and updating continuously: light probes are built on the fly, not at scene load.

And yes, UE4 does it right, but no game with that engine is out yet.
 

Sirim

Member
Probably gonna look real stupid once someone answers this question, but at what time-code is the Ubisoft email read out loud on the Bombcast?

I went to 2:55 as stated in the OP and didn't hear it after listening a while; just in case, I also went to 2 hours and 55 seconds and it wasn't there either. And the video is only 2 hours 40-something minutes long, so there's no 2 hours 55 minutes to go to.


Misread OP number.

Edit: A little disappointed that in the podcast, the GB team (though they mentioned the infamous "debate" quote) entirely left it out of their discussion of how ridiculous it is to be upset at a game for having a 900p resolution. It's strange that they'd mention the quote, then not acknowledge it as the actual source of frustration amongst people, and not the resolution itself.
 

RoboPlato

I'd be in the dick
Would you guys stop saying that! Drive Club does NOT have global illumination!!
Yes it does. EDIT: NVM, just saw your other post

Weren't you the guy who tried to say that there was no way Shadow Fall was using area lights prerelease, only to be floored by them in game?
 

shootfast

Member
This is a random email from an anonymous source that doesn't give much of an explanation that solves the question we've been asking. We don't even know if it's legitimate. Considering Ubi's constant disrespect of the audience in replying to this issue, I think skepticism is warranted. If an actual member of the dev team came out and gave a detailed tech explanation without going on a rant about how shitty their audience is and taking a dig at other games, then more respect would be warranted.

The guy/girl offered to provide identification; Brad never took him/her up on the offer.
 