
Unofficial response from Assassin's Creed dev on 900p drama. Bombcast 10/14/2014

gofreak

GAF's Bob Woodward
How is this different from other multiplatform games that have parity? Just because PS4 might be 900p?

The argument that the PS4 should be able to do more should also be valid for FIFA, Destiny, etc. Just because the Xbox version is 1080p, we don't complain quite as much?

I think the complaint here stems from the original comment that parity was something that was enforced more or less for 'political' reasons - to avoid debate.

If a dev had said the same of any of those other games, we'd have the same complaints I think.

I think cross-gen games get more of a pass on 'parity' though, because there's the sense that both machines should be able to do a cross-gen game at native full HD or whatever.

Ubisoft has given ample explanation - officially or unofficially - about why the framerate is what it is or would be similar on both systems. But I still feel there's a gap in the explanation of why the resolution is what it is on both systems. But I probably wouldn't have thought twice about it had it not been for that comment about 'having parity to avoid debate'. It sounds like a very contrived situation.

(I can also accept 'the dev process was complicated, we're on a schedule, we don't want to tempt fate by going further even if it might be possible on one system or the other' - but it's a somewhat different tune to the original explanation. Maybe the original explanation is what it is because that dev didn't want to suggest Ubi couldn't do better with their dev process/budget/schedule? A pride thing? I don't know. But very ill-advised.)
 

UrbanRats

Member
That was a good email. The guy sounded so bitter.

Sounds like a lot of work has been put into making this thing even run at a decent framerate, so i can understand that.

It would be a decent answer, if it wasn't for the useless Mordor shit-slinging and the initial line about 'who cares about 1080p, if graphics are good enough?' (paraphrasing).

I always accepted the idea of compromises, i just want them to STOP trying to justify them with patronizing bullshit like "blurrier is actually better for you".

This is assuming the email is legit and all that...
 

vrln

Neo Member
Secondly, I love how they talk about their pre-baked lighting like it's this amazing thing and they have loads of data on the Blu-ray for their lighting system. Sorry, it is a pre-baked system; seriously, with the amount of compute performance you have on tap I expect more games to take the Driveclub route and do fully dynamic GI.

You are aware how much power this actually takes? There's a reason we are seeing a lot of baked lighting and will continue to. The PS4 is not a powerhouse. A 1.8 TF GPU isn't much when Unreal 4 originally required 2.5 at the bare minimum to enable its global illumination system. Add to that the fact that the PS4 GPU is driven by a CPU that is a joke compared to anything on the PC side of the fence. What people seem to have trouble accepting is the fact that these are both fairly weak consoles for a 1080p framebuffer, especially if you are trying to pull off dynamic lighting. Driveclub is the exception and does it because it's a linear racer. And apparently it sacrifices texture detail (no AF) and IQ (lots of aliasing) among other things. If the PS4 can't run this game at 1080p on baked lighting, you can bet there's no chance in hell they could ever pull it off on a GI system...

The days when global illumination is the expected norm aren't here yet; that's what the next generation of consoles is going to be about, among other things. I get that we are used to seeing dramatic performance increases between generations, but the industry has changed. No one sells a console at a loss for years anymore in the hope of recovering the money 3-4 years later. It's just not good business.
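
For what it's worth, the 1.8 TF figure checks out against the public GCN configurations - a back-of-the-envelope sketch of my own, not anything from the email:

```python
# Rough theoretical peak-FLOPS check for both consoles' GCN GPUs,
# using the publicly known configurations (PS4: 18 CUs @ 800 MHz,
# Xbox One: 12 CUs @ 853 MHz).
def peak_tflops(compute_units, clock_mhz):
    lanes_per_cu = 64    # GCN: four SIMD-16 vector units per CU
    flops_per_lane = 2   # a fused multiply-add counts as 2 ops/cycle
    return compute_units * lanes_per_cu * flops_per_lane * clock_mhz * 1e6 / 1e12

print(f"PS4: {peak_tflops(18, 800):.2f} TF")       # ~1.84 TF
print(f"Xbox One: {peak_tflops(12, 853):.2f} TF")  # ~1.31 TF
```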
 

BigDug13

Member
How is this different from other multiplatform games that have parity? Just because PS4 might be 900p?

The argument that the PS4 should be able to do more should also be valid for FIFA, Destiny, etc. Just because the Xbox version is 1080p, we don't complain quite as much? (I know there were still complaints, but nothing like this.) Maybe it couldn't get a higher framerate if CPU-bound, and there isn't a higher resolution to go to. But they could improve AA, or increase the detail of other aspects like shadows, AO, etc.


The unfortunate and simplest reality is probably not conspiracy or parity clauses, but simply time and ambition. If they've only just got it to 900/30, then maybe they want to just get the damn thing out the door. Maybe they needed to put more people on the Xbox version to get the performance up to that baseline, and that resulted in them not being able to push further with the PS4 version just now?

On the bright side, maybe they'll be able to bring out a day one patch for PS4 like they did with Black Flag?

My TV is 1080p, not 900p. If they target 1080p like AC Unity did and end up at 1080p on both consoles, I don't complain. If they target 1080p and only get to 900p, which is BELOW target, on the more powerful console, and also reach that exact same below-target resolution on the weaker console, then I start to question where that extra power went. Because now, instead of "reaching target", we have a situation where both systems fell short of the target by exactly the same amount, even though the spec difference means these systems should not be missing the target resolution and framerate equally.

If a developer reaches their target on both consoles like Destiny, then fantastic. Everybody wins. If a developer can't even reach their target and seemingly doesn't take advantage of the more powerful console, reaching the exact same "below target" plateau, then we start to question it.
 

Averon

Member
So many PR schmucks seem to take this angle of "Just play the game! 900p's just a number!" Which is true, I guess, but it's a number that's directly linked to how good a game looks. Higher number = prettier game, simple as that. They spend all this time talking up how amazing their graphics are, but when they can't reach 1080p all of a sudden it's "Oh don't be so shallow with your numbers, are you playing the graphics or the game?"

Then when the game comes out and we find it's a choppy mess at 20-something frames a second anyway, there's no culpability because we've already paid our money and Ubisoft's already gearing up for next year's game.

These developers want to have their cake and eat it too. They want to impress us about how great their latest graphic engine is, how the lighting is spectacular, how the cloth physics are more realistic than ever!!! I have yet to see a marketing campaign for a game that doesn't mention how great their game looks in video at least a few times.

But at the same time they want us to disregard resolution and frame rate, two aspects that directly impact the visual package of a game.

So which is it? Do they want us to care about how visually pleasing their game is or not?
 

Marlenus

Member
You are aware how much power this actually takes? There's a reason we are seeing a lot of baked lighting and will continue to. The PS4 is not a powerhouse. A 1.8 TF GPU isn't much when Unreal 4 originally required 2.5 at the bare minimum to enable its global illumination system. Add to that the fact that the PS4 GPU is driven by a CPU that is a joke compared to anything on the PC side of the fence. What people seem to have trouble accepting is the fact that these are both fairly weak consoles for a 1080p framebuffer, especially if you are trying to pull off dynamic lighting. Driveclub is the exception and does it because it's a linear racer. And apparently it sacrifices texture detail (no AF) and IQ (lots of aliasing) among other things. If the PS4 can't run this game at 1080p on baked lighting, you can bet there's no chance in hell they could ever pull it off on a GI system...

The days when global illumination is the expected norm aren't here yet; that's what the next generation of consoles is going to be about, among other things. I get that we are used to seeing dramatic performance increases between generations, but the industry has changed. No one sells a console at a loss for years anymore in the hope of recovering the money 3-4 years later. It's just not good business.

I am not saying all games should do dynamic GI, I just expect more to, especially once GPU compute starts to take off. The issue is that, based on what Ubi have told us, they could run this game at 1080p on the PS4 and get the same framerate the Xbox One does at 900p; they just chose not to bother and are now spouting BS about why they cannot.
 

vrln

Neo Member
We obviously need to wait and see the comparisons on image quality and performance. Resolution isn't the entire picture as we've seen before.

However, complaining about being CPU-bound when there is untapped potential in GPGPU performance is strange. Unless I've missed something...?

http://twvideo01.ubm-us.net/o1/vault/gdceurope2014/Presentations/828884_Alexis_Vaisse.pdf

GPGPU isn't just something you can easily plug in to boost general CPU performance. It takes time to learn and these are still early games for these systems. Special platform-specific features like GPGPU aren't generally used much in multiplatform games, especially not this early in the generation.

Major GPGPU work for now is mostly confined to first party development. It's the PS4's endgame plan. Infamous does it among other things if I remember correctly, but it's not really realistic to expect a third party game to start doing it this early.
 
I think people are overreacting. Developing a multiplatform game requires time and sacrifices while balancing both platforms. Based on the tech articles, Vincent's statement about being CPU-bound (which people say is technically sound), and the fact that previous builds of AC had this game at 792p on both PS4 and Xbox One, closer to release they probably decided to lock the game at 900p and prioritize polish, with little time left to squeeze more out of the PS4. This is really logical. No conspiracy theories needed.

Multiplatform games have tough schedules. They could simply not have had enough time to squeeze out more PS4 performance while working on the game.

It's not magic for a multiplatform game to run better on the PS4. It takes time and resources.

Doesn't seem to be an issue with many of the multiplat games. Last I checked, it was the X1 that required said "time" to pull performance out of.
 
His explanation should be fairly easy to refute.

Test it on PC with equivalent CPUs and GPUs and let's see if it runs virtually the same. I am guessing it won't...
 
Why does running at 900/30 on the bone confirm that Unity absolutely must be able to run at 1080/30 or 900/60 on PS4 for some people?

Is there even a shred of proof?

The PS4's hardware is more powerful, that is a given, but where is the proof in the rage posts that this power difference means the game can definitively run at 1080p with a steady 30 FPS, or hold a steady 60 FPS at the same 900p resolution?

Anyone?

What if the engine will only run at 25fps at 1080p? Is that preferable? Would you prefer if they began to remove visual enhancements one at a time till they reached a completely steady 30 FPS with no drops?

Furthermore, how do we know the bone version has as stable a frame rate as the PS4 version? Or visual enhancements as numerous or as good as the PS4 version's?
 

ReBurn

Gold Member
I wish I was a game developer and knew how to build software for the PS4 and Xbox One. At least that way I would have some sort of relevant frame of reference to call bullshit. But since I'm not I will just say that I don't care what the native resolution of the game is. I just hope that it is fun to play.

His explanation should be fairly easy to refute.

Test it on PC with equivalent CPUs and GPUs and let's see if it runs virtually the same. I am guessing it won't...

I'm not quite sure that's how it works.
 
GPGPU isn't just something you can easily plug in to boost general CPU performance. It takes time to learn and these are still early games for these systems. Special platform-specific features like GPGPU aren't generally used much in multiplatform games, especially not this early in the generation.

Major GPGPU work for now is mostly confined to first party development. It's the PS4's endgame plan. Infamous does it among other things if I remember correctly, but it's not really realistic to expect a third party game to start doing it this early.

Thanks that makes sense. It does seem like Ubi are at least contemplating using it in the future though, if those slides are anything to go by.
 

driver116

Member
The difference in power is probably linked to optimisation, in which PS4 takes less time to optimise to 900p/30fps.

Something like this:

[graph image: iJKe5880sNtUe.png]
 
Why does running at 900/30 on the bone confirm that Unity absolutely must be able to run at 1080/30 or 900/60 on PS4 for some people?

Is there even a shred of proof?

The PS4's hardware is more powerful, that is a given, but where is the proof in the rage posts that this power difference means the game can definitively run at 1080p with a steady 30 FPS, or hold a steady 60 FPS at the same 900p resolution?

Anyone?

I bet this has been answered more than once actually. Anyway: the PS4 has about 40% more GPU power and also more memory bandwidth, with no hard size limit like the ESRAM. 1080p has 44% more pixels than 900p. Even if it scaled 1:1 with resolution (in reality it's usually a little less than that), the PS4 would manage 1080p at about the same frame rate. Furthermore, the GPU architectures are identical, so most optimizations done for the Xbox One version in this regard should directly translate over to the PS4 version. My opinion: 1080p should be possible on PS4 without all that much extra effort. Alternatively they could stay at the same resolution but use the extra power for higher-res shadow maps or the like. Or maybe the Xbox One doesn't run at a stable 30 fps in GPU-bound situations, but the PS4 version could.

Now 900p at 60 fps would be an entirely different matter. You'd need twice as much of everything, including CPU power. Only possible on PC.
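
Here's the arithmetic behind those percentages, as a quick sketch (the 1:1 scaling assumption is the pessimistic one noted above):

```python
# Pixel counts behind the "44% more pixels" figure.
pixels_1080p = 1920 * 1080   # 2,073,600
pixels_900p  = 1600 * 900    # 1,440,000
print(pixels_1080p / pixels_900p)   # 1.44 -> 44% more pixels

# Assuming GPU cost scales linearly with pixel count (a rough,
# slightly pessimistic model), a ~40% stronger GPU lands at roughly
# the same frame rate at 1080p as the weaker one does at 900p.
gpu_ratio = 1.40                    # assumed PS4/XB1 GPU throughput ratio
print(gpu_ratio / 1.44)             # ~0.97x the frame rate
```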

Furthermore, how do we know the bone version has as stable a frame rate as the PS4 version? Or visual enhancements as numerous or as good as the PS4 version's?

We don't (though nothing hints at it so far). We'll have to wait and see.
 

Jomjom

Banned
I wish Sony would just send an ICE team over to Ubi and offer to optimize the game a la MS with Destiny.

If Ubi turns them away at the door, they'll know it's something contractual; if they don't and ICE gets it to 1080p, they'll know it was because the AC:U team is either incompetent or didn't have enough time.
 

vrln

Neo Member
Thanks that makes sense. It does seem like Ubi are at least contemplating using it in the future though, if those slides are anything to go by.

Eventually, as knowledge and tools build up, multiplatform games will be optimised more and more for each specific platform. Unity is still a game that was in development before the official specifications were even known. The first "real" games designed for these specifications from the ground up will be out next year or so. Also, the XB1 has GPGPU too, just not as much. It will be a long-term thing for both. The projects that only started this year are the ones that will really push graphics far.

As a technology geek, first party development is what I mostly like to follow (and play). The clear difference in power between the consoles is a lot less apparent there (I would even claim you can't see it at all right now). That's where you see the real magic happening. FH2 is a stunning example of how well you can do when the entire engine is built specifically for the XB1, and it's still deep down based on an engine that was designed when final hardware wasn't available. My hope is that both platforms do tons of first party development. Can't wait to see what technological powerhouses like Naughty Dog and 343 Industries can cook up :)
 
Furthermore, the GPU architectures are identical, so most optimizations done for the Xbox One version in this regard should directly translate over to the PS4 version.

I doubt that is entirely true where optimisation relates to specific cases within each console's APIs. Unless you're talking about generic middleware.
 

vrln

Neo Member
I wish Sony would just send an ICE team over to Ubi and offer to optimize the game a la MS with Destiny.

If Ubi turns them away at the door, they'll know it's something contractual; if they don't and ICE gets it to 1080p, they'll know it was because the AC:U team is either incompetent or didn't have enough time.

This is a good point (although I disagree with the incompetent/not enough time part). So far there's reason to assume Sony has not started doing hands-on assistance on this level yet, and it's hurting them. No third party developer can be as good at optimising a game for a specific platform as that platform's own optimisation specialists. That has nothing to do with incompetence, it's just that there's no incentive to specialise that far if you are doing multiplatform work. And besides, it would probably be impossible too - teams like ICE/the "MS tech squad" that went to Bungie have spent tons of time getting to that expertise level. They have access to the silicon developers, etc.

We don't have any developers saying on record that Sony sent them a bunch of specialists that helped them hit a target. It's brilliant business to help AAA games reach parity (MS) or go even further (Sony), but also very expensive. I don't mean this as flame bait, but the argument can be made that Sony has not been very active compared to MS after launching the console. Where are the OS updates? Maybe Sony are bleeding too much money to afford this. They designed the more powerful hardware, but after that they seem to have dropped the ball somewhat. Just imagine how awesome the PS4 would be if it had substantial monthly OS updates and Sony assisted the most important third party games on the level MS apparently does (those guys are wizards to get Destiny running at 1080p on a system that is less powerful).
 

Jomjom

Banned
This is a good point. So far there's reason to assume Sony has not started doing hands-on assistance on this level yet, and it's hurting them. No third party developer can be as good at optimising a game for a specific platform as that platform's own optimisation specialists. That has nothing to do with incompetence, it's just that there's no incentive to specialise that far if you are doing multiplatform work. And besides, it would probably be impossible too - teams like ICE/the "MS tech squad" that went to Bungie have spent tons of time getting to that expertise level. They have access to the silicon developers, etc.

We don't have any developers saying on record that Sony sent them a bunch of specialists that helped them hit a target. It's brilliant business to help AAA games reach parity (MS) or go even further (Sony), but also very expensive. I don't mean this as flame bait, but the argument can be made that Sony has not been very active compared to MS after launching the console. Where are the OS updates? Maybe Sony are bleeding too much money to afford this. They designed the more powerful hardware, but after that they seem to have dropped the ball somewhat. Just imagine how awesome the PS4 would be if it had substantial monthly OS updates and Sony assisted the most important third party games on the level MS apparently does (those guys are wizards to get Destiny running at 1080p on a system that is less powerful).

Yeah this is why my dreams consist of MS and Sony (and Nintendo too why not) joining forces and just creating a console that combines all of their strengths.
 

Dryk

Member
That tone reeks of an engineer who has started to realise he bit off more than he could chew in the early design stages, but hasn't passed the denial stage yet because the project isn't over.
 

RVinP

Unconfirmed Member
Anyway: the PS4 has about 40% more GPU power and also more memory bandwidth, with no hard size limit like the ESRAM. 1080p has 44% more pixels than 900p. Even if it scaled 1:1 with resolution (in reality it's usually a little less than that), the PS4 would manage 1080p at about the same frame rate.

That's entirely arbitrary, unless one knows the ins and outs of this specific game's engine, and I presume no one here does (or even if they did, they'd be tight-lipped about it).

Edit: This thread's topic couldn't be more accurate, it's 900p drama alright.
 

Nzyme32

Member
The difference in power is probably linked to optimisation, in which PS4 takes less time to optimise to 900p/30fps.

Something like this:

[graph image: iJKe5880sNtUe.png]

Fuck's sake. Are we really at the point where you need to represent your position with a childish graph of non-data?

Fucking hell. These posts are laughable
 

vrln

Neo Member
Yeah this is why my dreams consist of MS and Sony (and Nintendo too why not) joining forces and just creating a console that combines all of their strengths.

A single console future could be nice... I have some reservations about the lack of hardware competition, but when I think about it I might be wrong. Especially if that means we can go back to how consoles were designed previously (sold at a loss in the beginning). Perhaps this way would be the only path to getting a console that's as powerful as they used to be at launch. Right now it's basically a "pick your poison" situation: PS4 for more powerful hardware with weak software support or XB1 for less powerful hardware with strong software support.
 
I doubt that is entirely true where optimisation relates to specific cases within each console's APIs. Unless you're talking about generic middleware.

Of course you can't port the code 1:1 when you're using different APIs or shader languages, but the underlying hardware remains the same. Stuff like favorable memory layouts and caching optimizations will benefit both versions. And that's just the platform/hardware-specific side. Clever algorithms in general are often independent of that.
Honestly: if it were generally that difficult we'd see more problems with PC versions of games. It's supposedly the weakest platform in terms of sales while also being the most challenging in terms of different hardware to support. Still, most multiplatform games run just fine on hardware that is equivalent to the consoles, at least when looking at the GPU side (the CPU overhead on PC is more of a problem).
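
As a toy example of the kind of optimization that carries over (my own sketch, nothing from the slides or the game's code):

```python
import numpy as np

# A "struct of arrays" layout keeps the fields you actually touch
# contiguous in memory, so a pass over positions streams far less
# data through the caches than an interleaved layout would. The
# same trick pays off on PS4, XB1 and PC alike.
n = 100_000

# Array-of-structs: each 76-byte record interleaves position + payload.
aos = np.zeros(n, dtype=[("pos", "3f4"), ("misc", "16f4")])
centroid_aos = aos["pos"].mean(axis=0)   # strided reads, cache-unfriendly

# Struct-of-arrays: positions packed tightly, 12 bytes per entity.
positions = np.zeros((n, 3), dtype=np.float32)
centroid_soa = positions.mean(axis=0)    # contiguous reads
```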


edit:
That's entirely arbitrary, unless one knows the ins and outs of this specific game's engine, and I presume no one here does (or even if they did, they'd be tight-lipped about it).

It's really not. Try asking some of the more respected/well known members like Durante on this forum if you don't believe it.
 

c0de

Member
GPGPU isn't just something you can easily plug in to boost general CPU performance. It takes time to learn and these are still early games for these systems. Special platform-specific features like GPGPU aren't generally used much in multiplatform games, especially not this early in the generation.

Major GPGPU work for now is mostly confined to first party development. It's the PS4's endgame plan. Infamous does it among other things if I remember correctly, but it's not really realistic to expect a third party game to start doing it this early.

Not only this, but GPGPU uses hardware resources which are still also needed to do, well, GPU work. It depends on how much devs will "sacrifice" from the GPU resources to spend on GPGPU tasks. There is no "free, untouched" hardware waiting. And the resources you "offer" to GPGPU are also affected by the resources other parts of the GPU use (e.g. RAM and its bandwidth). Some people see GPGPU as resources idling around, just waiting to get touched.
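
To make the trade-off concrete with toy numbers (entirely made up, just to show where the time comes from):

```python
# Illustrative GPU frame-budget split at 30 fps. The numbers are
# invented; the point is that GPGPU time comes straight out of the
# same budget rendering uses (ignoring async overlap for simplicity).
frame_budget_ms = 1000 / 30            # ~33.3 ms per frame

gpgpu_tasks_ms = {                     # hypothetical compute workloads
    "cloth sim": 2.0,
    "particles": 1.5,
    "crowd helpers": 2.5,
}

compute_ms = sum(gpgpu_tasks_ms.values())
render_ms = frame_budget_ms - compute_ms
print(f"{compute_ms:.1f} ms of compute leaves {render_ms:.1f} ms for rendering")
# -> 6.0 ms of compute leaves 27.3 ms for rendering
```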
 

hateradio

The Most Dangerous Yes Man
The last part was so awful.

I don't see why UBI didn't reduce the number of NPCs. Do we really need thousands of people there at all times? I don't see why we do.


The difference in power is probably linked to optimisation, in which PS4 takes less time to optimise to 900p/30fps.

Something like this:

[graph image: iJKe5880sNtUe.png]
This is awesome.
I assume it's a joke.
 

iNvid02

Member
The last part was so awful.

I don't see why UBI didn't reduce the number of NPCs. Do we really need thousands of people there at all times? I don't see why we do.

i doubt it's at all times, they probably turn it all the way up during story missions to add to the atmosphere, and no doubt it will make Paris feel even more alive and ready to revolt. seeing that huge crowd down there from atop the tower was pretty impressive i gotta say.

The difference in power is probably linked to optimisation, in which PS4 takes less time to optimise to 900p/30fps.

Something like this:

[graph image: iJKe5880sNtUe.png]

i like this post a lot, straight to the point and easy to digest. 10/10
 

Chev

Member
Funny how this is the same Ubisoft that released a presentation about offloading cloth simulation from the CPU to GPGPU. The CPU was too weak for the calculation, so GPGPU was the solution.

http://gdcvault.com/play/1020939/Efficient-Usage-of-Compute-Shaders

Page 120: the PS4 is deemed more powerful on the GPGPU side. Funny how for their flagship game they decided to take the CPU-bound approach.

Different use cases, nothing funny about that. Crowd AI versus cloth, i.e. gameplay vs non-gameplay code. GPGPU's great strength is its parallelism, but it comes with a great limitation: your CPU and GPU act as separate systems, and getting results back from the GPU to the CPU potentially costs a lot of performance and is really tricky to optimize.

That's why the stuff that's good to offload to GPGPU is visuals and fluff. Particles, cloth, illumination, all those are fine because you can keep them in a black box, both computing them and rendering them on the GPU, because they don't factor into gameplay. But anything the CPU is gonna need to have an active look at is gonna cost you just to get the information back. So in many of those cases you'll find you get better performance staying CPU-bound instead of waiting for the GPU to deliver.
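
To put toy numbers on that (latencies invented purely for illustration):

```python
# Toy model (invented latencies) of why gameplay code resists GPGPU:
# the CPU can't act on the results until they round-trip back.
gpu_compute_ms      = 1.0   # the batch itself is fast on the GPU
readback_latency_ms = 4.0   # sync point + copy back to CPU-visible memory
cpu_equivalent_ms   = 3.0   # same work done directly on the CPU

gpgpu_path = gpu_compute_ms + readback_latency_ms   # 5.0 ms, CPU stalls
cpu_path   = cpu_equivalent_ms                      # 3.0 ms, no stall
print(f"GPGPU path: {gpgpu_path} ms, CPU path: {cpu_path} ms")

# Visual-only work (particles, cloth) never needs the readback,
# which is exactly why it's the natural thing to offload.
```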
 

c0de

Member
Funny how this is the same Ubisoft that released a presentation about offloading cloth simulation from the CPU to GPGPU. The CPU was too weak for the calculation, so GPGPU was the solution.

http://gdcvault.com/play/1020939/Efficient-Usage-of-Compute-Shaders

Page 120: the PS4 is deemed more powerful on the GPGPU side. Funny how for their flagship game they decided to take the CPU-bound approach.

Interesting bits in this:
- Xbox One CPU more powerful than PS4's
- slides only mention PS4 optimization
- slides say approach relies heavily on memory bw
- slides don't mention ESRAM in any way
 

Aliand

Banned
Different use cases, nothing funny about that. Crowd AI versus cloth, i.e. gameplay vs non-gameplay code. GPGPU's great strength is its parallelism, but it comes with a great limitation: your CPU and GPU act as separate systems, and getting results back from the GPU to the CPU potentially costs a lot of performance and is really tricky to optimize.

That's why the stuff that's good to offload to GPGPU is visuals and fluff. Particles, cloth, illumination, all those are fine because you can keep them in a black box, both computing them and rendering them on the GPU, because they don't factor into gameplay. But anything the CPU is gonna need to have an active look at is gonna cost you just to get the information back. So in many of those cases you'll find you get better performance staying CPU-bound instead of waiting for the GPU to deliver.

Yeah but Ubisoft's comment shifted from "Dude the AI is eating the CPU" to "Dat 25GB Lighting!!!"
 

Aliand

Banned
Interesting bits in this:
- Xbox One CPU more powerful than PS4's
- slides only mention PS4 optimization
- slides say approach relies heavily on memory bw
- slides don't mention ESRAM in any way

Overclock + Kinect effect.
Is the ESRAM a bottleneck of some sort? The 25GB/s bandwidth might be killing it?
 

c0de

Member
Overclock + Kinect effect.
Is the ESRAM a bottleneck of some sort? The 25GB/s bandwidth might be killing it?

We just don't know, as the slides don't mention it. But from how they are done, it seems the optimization effort went especially into the PS4, while the Xbox One results are just shown as numbers with no other mention at all. We don't know if they used the ESRAM at all.
 

martino

Member
We just don't know, as the slides don't mention it. But from how they are done, it seems the optimization effort went especially into the PS4, while the Xbox One results are just shown as numbers with no other mention at all. We don't know if they used the ESRAM at all.

Wouldn't it be more surprising if they didn't use it?
 

Data West

coaches in the WNBA
If the game is as pretty and fun as ours will be, who cares?
Mordor has next gen system and gameplay, but not graphics like Unity does.
The result is amazing graphically, the depth of field and lighting effects are beyond anything you've seen on the market, and even may surpass Infamous and others.

'It's not about graphics. But here's how our graphics are totally better!'
 

Marlenus

Member
Why does running at 900/30 on the bone confirm that Unity absolutely must be able to run at 1080/30 or 900/60 on PS4 for some people?

Is there even a shred of proof?

The PS4's hardware is more powerful, that is a given, but where is the proof in the rage posts that this power difference means the game can definitively run at 1080p with a steady 30 FPS, or hold a steady 60 FPS at the same 900p resolution?

Anyone?

What if the engine will only run at 25fps at 1080p? Is that preferable? Would you prefer if they began to remove visual enhancements one at a time till they reached a completely steady 30 FPS with no drops?

Furthermore, how do we know the bone version has as stable a frame rate as the PS4 version? Or visual enhancements as numerous or as good as the PS4 version's?

Results are from TPU http://www.techpowerup.com/reviews/S..._Dual-X/6.html/

Game              260X @ 900p   265 @ 1080p   (average fps)
ACIV                     25.7          27.7
Batman: AO               32.2          66.5
Battlefield 3            53.4          52.6
Battlefield 4            35.7          35.2
BS: Infinite             68.0          61.1
COD:G                    64.3          60.8
COJ: Gunslinger         128.3         127.8
Crysis                   43.3          41.8
Crysis 3                 20.8          20.8
Diablo 3                 88.2         104.9
Far Cry 3                26.7          25.4
Metro: LL                38.3          35.2
Splinter Cell            27.1          29.6
Tomb Raider              27.0          23.6
WoW                      66.9          74.9

In no circumstance above does going from 900p on the 260X to 1080p on the 265 result in a massive frame rate drop. Since this is a PC-based benchmark we are comparing just the GPU differences, and there is nothing else that will skew the results. The 260X is a bit above the Xbox One GPU in terms of performance, so the gaps you see above are smaller than they would be with a closer match to the Xbox One GPU. The 265 is practically the same as the PS4 GPU.
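
Reading the ratios out of the table directly (a quick script over a subset of the numbers above):

```python
# Frame-rate ratios from the TPU numbers above:
# game -> (260X @ 900p, 265 @ 1080p), average fps.
results = {
    "ACIV": (25.7, 27.7),          "Batman: AO": (32.2, 66.5),
    "Battlefield 3": (53.4, 52.6), "Battlefield 4": (35.7, 35.2),
    "BS: Infinite": (68.0, 61.1),  "Crysis 3": (20.8, 20.8),
    "Far Cry 3": (26.7, 25.4),     "Tomb Raider": (27.0, 23.6),
}

for game, (fps_900, fps_1080) in results.items():
    print(f"{game}: {fps_1080 / fps_900:.2f}x")

# Everything sits around 0.87-1.08x: the stronger GPU roughly holds
# its frame rate despite pushing 44% more pixels. Batman is the
# outlier at ~2.07x, which the bandwidth/ROPs point below may explain.
```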

I also need to be clear here and state that these results are not any kind of expected performance target for the consoles because PCs are different to the consoles. I am just saying that these are the relative GPU differences in a real world scenario.

In the console sphere the PS4 has less overhead than the Xbox One in terms of its OS and API, leading to slightly better CPU performance despite the slightly slower clock speed. When you add that to the above information, it is pretty conclusive proof that what the Xbox One can do at 900p the PS4 can do at 1080p.

It also highlights a scenario (Batman) where you get a much larger gap than expected; perhaps whatever causes this gap in Batman is the same as what causes the gap in Fox Engine based games. Looking at the specs of the GPUs used here, I would put that gap down to either memory bandwidth (in which case perhaps the Xbox One can get a speed bump in Fox Engine games with better ESRAM usage) or the limited number of ROPs.
 