
Digital Foundry: The Witcher 3 patch 1.03 performance analysis (PS4/XB1)

I'm not mocking you. I only bring up the PC version because it's relevant to what you're talking about re: GPUs.

You're really oversimplifying things by comparing this to all other ps4 games. It's one of the most demanding games this gen so far, probably only 2nd to AC:U. It's not so cut and dry as: "well infamous runs at 1080/30, so this should too!". That doesn't really make sense.

None of this explains how the PS4 version's performance is worse than that of a significantly less powerful platform. My point all along is that it's a result of bad development, not the PS4 version being at 1080p or the PS4 GPU being unable to handle it, while a much less powerful GPU in its main competitor inexplicably runs it better.
And Assassin's Creed Unity is an even worse example of developer incompetence when you consider that both PS4 and Xbox One run it at 900p and, as if by magic, both run it at roughly the same framerate.
 

cakely

Member
Isn't it true that neither MS nor AMD has ever revealed exactly what GPU is in the XB1? How do we know the difference is 40%? I mean, the GPU in the XB1 was custom made, correct?

Maybe there's a second, hidden GPU core stacked in the Xbox One? I think we're on to something, here.
 

ethomaz

Banned
Isn't it true that neither MS nor AMD has ever revealed exactly what GPU is in the XB1? How do we know the difference is 40%? I mean, the GPU in the XB1 was custom made, correct?
MS did reveal the specs officially.

Both are GCN based... the chip (CPU + GPU + eSRAM) is custom made.
 
I'm not mocking you. I only bring up the PC version because it's relevant to what you're talking about re: GPUs.

You're really oversimplifying things by comparing this to all other ps4 games. It's one of the most demanding games this gen so far, probably only 2nd to AC:U. It's not so cut and dry as: "well infamous runs at 1080/30, so this should too!". That doesn't really make sense.

Much more demanding than AC: Unity. You don't see several alpha-masked sprites along with physics and full dynamic shadows cast by the time-of-day lighting in Unity.
 

ethomaz

Banned
That's more than enough of a power advantage to run the game at least at the same framerate as the Xbox One with the same effects/assets.
That's the point I have been making.
At 1080p? Because I guess at 900p the game could run at 40% better framerate than Xbox One.

So what is your point?
 

Seanspeed

Banned
None of this explains how the PS4 version's performance is worse than that of a significantly less powerful platform.
As has been pointed out many times to you already, the PS4 version has to render the game at 1080p, which, at 44% more pixels than 900p, uses a substantial amount of GPU power right there. Therefore there isn't always enough horsepower left to improve the framerate, too. Sometimes there is, sometimes there isn't. Sometimes it may even leave your performance worse off. It does not work the same with every game for various reasons.
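For reference, a quick back-of-the-envelope check of that 44% figure. This is just the raw pixel counts, nothing game-specific (a minimal Python sketch):

# Pixel counts behind the 44% figure
xb1_pixels = 1600 * 900    # 1,440,000 pixels per frame at 900p
ps4_pixels = 1920 * 1080   # 2,073,600 pixels per frame at 1080p

extra = ps4_pixels / xb1_pixels - 1
print(f"{ps4_pixels - xb1_pixels:,} more pixels per frame ({extra:.0%} more)")
# -> 633,600 more pixels per frame (44% more)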

You are confused as to what having a more powerful GPU actually means, I think.
 
At 1080p? Because I guess at 900p the game could run at 40% better framerate than Xbox One.

So what is your point?

That the PS4 GPU has enough extra power compared to the Xbox One GPU to run the same game at 1080p at the same or better framerate than the Xbox One gets at 900p.
 

Conduit

Banned
At 1080p? Because I'm sure at 900p the game could run at 40% better framerate than Xbox One.

So what is your point?

Yes, @1080p. The PS4's GPU has enough resources (higher texel and pixel fillrate) to run the game in Full HD (higher ROP count) with the same assets and the same (or better) framerate.

Well, the PS4 version of W3 has better AF and AA, plus a higher resolution. I believe there is also plenty of headroom left to run the game at a stable 30 fps.
 

cakely

Member
That the PS4 GPU has enough extra power compared to the Xbox One GPU to run the same game at 1080p at the same or better framerate than the Xbox One gets at 900p.

As has been pointed out: the PS4 version is rendering 44% more pixels than the Xbox One version of The Witcher 3.
 

Seanspeed

Banned
That the PS4 GPU has enough extra power compared to the Xbox One GPU to run the same game at 1080p at the same or better framerate than the Xbox One gets at 900p?
Possibly, but you can't know that. Naming examples of other games doesn't work because games don't all work the same way.

It's one thing to speculate that it could. Hell, I feel that is quite likely myself. But I wouldn't go around acting sure of myself, because I would still just be running under an assumption that might be incorrect. And then it gets worse when you take an assumption as the truth and jump to further conclusions based on that assumption. It becomes a very weak support structure for a belief or argument.
 

ethomaz

Banned
That the PS4 GPU has enough extra power compared to the Xbox One GPU to run the same game at 1080p at the same or better framerate than the Xbox One gets at 900p.
Proof? Because from what I see in the GPU world (PC), 40% more horsepower doesn't mean 40% better performance... you need to see if there are other bottlenecks, like the CPU for example.

That said.

The difference in resolution is 44%, so even in the best-case (ideal) scenario the PS4 version will run at 1080p at 3-5 frames below what it runs at 900p on Xbox One.
 

Seanspeed

Banned
Proof? Because from what I see in the GPU world (PC), 40% more horsepower doesn't mean 40% better performance... you need to see if there are other bottlenecks, like the CPU for example.

That said.

The difference in resolution is 44%, so even in the best-case (ideal) scenario the PS4 version will run at 1080p at 3-5 frames below what it runs at 900p on Xbox One.
See, even here, you just can't use the math to come to hard figures like this.

You can use them as rough guides, but that's about it. You're kind of making the same mistake he is, thinking that you can figure out exactly how much better a game should be based on numbers that are really only rough figures themselves (not every GPU is built or works the same, nor is 'power' any one specific metric).
 

Conduit

Banned
Proof? Because from what I see in the GPU world (PC), 40% more horsepower doesn't mean 40% better performance... you need to see if there are other bottlenecks, like the CPU for example.

That said.

The difference in resolution is 44%, so even in the best-case (ideal) scenario the PS4 version will run at 1080p at 3-5 frames below what it runs at 900p on Xbox One.

Hell no. Higher bandwidth, a better GPU in every possible way, over 500 GFLOPS of difference, and just a 'tiny' 180p makes the difference in framerate in favor of the weaker console? That's hilarious!
 
Hell no. Higher bandwidth, better GPU, over 500 GFLOPS difference and just 'tiny' 180p make the difference in framerate?
It depends on what the load balance on the GPU is, i.e. which part of the GPU is hit hardest. So in some situations a PS4 could do a better framerate with slightly better effects work. In other situations... it could have problems doing that @ 1080p. You also have certain flat costs that do not really scale linearly.

So it is different from game to game and also different from scene to scene at times.
 

ethomaz

Banned
See, even here, you just can't use the math to come to hard figures like this.

You can use them as rough guides, but that's about it. You're kind of making the same mistake he is, thinking that you can figure out exactly how much better a game should be based on numbers that are really only rough figures themselves (not every GPU is built or works the same, nor is 'power' any one specific metric).
That is exactly what I said/think.

The hypothetical example only holds if you have 100% scaling with power... that has been proven false in so many ways: in some cases a 40% better GPU can run a game only 20% better, or even 60% better in another game... each case is different.
 

Seanspeed

Banned
Hell no. Higher bandwidth, better GPU, over 500 GFLOPS difference and just 'tiny' 180p make the difference in framerate?
1600x900 vs 1920x1080 is actually a difference of over 600,000 pixels needing to be rendered every frame. Not actually that 'tiny' in terms of GPU demand.

That is exactly what I said/think.

The hypothetical example only holds if you have 100% scaling with power... that has been proven false in so many ways: in some cases a 40% better GPU can run a game only 20% better, or even 60% better in another game... each case is different.
Ah ok, my bad man.
 

ethomaz

Banned
Hell no. Higher bandwidth, a better GPU in every possible way, over 500 GFLOPS of difference, and just a 'tiny' 180p makes the difference in framerate in favor of the weaker console? That's hilarious!
First, 180p is a 44% difference... not tiny.

Second, GPU performance doesn't scale linearly.

Third, there are other things that can hold back performance too, like the CPU... if the CPU is a bottleneck then even a 100% more powerful GPU won't give you a good performance boost.
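To illustrate that third point, here's a minimal sketch of the usual simplification that a frame can't finish until both its CPU work and its GPU work are done. The millisecond figures are invented purely for illustration, not measurements of The Witcher 3:

# Toy model: a frame is gated by whichever of CPU or GPU takes longer.
# All timings below are invented for illustration, not measurements.
def fps(cpu_ms: float, gpu_ms: float) -> float:
    return 1000.0 / max(cpu_ms, gpu_ms)

print(fps(cpu_ms=36.0, gpu_ms=33.0))  # ~27.8 fps, CPU-bound
print(fps(cpu_ms=36.0, gpu_ms=23.0))  # still ~27.8 fps: a ~40% faster GPU changes
                                      # nothing while the CPU remains the bottleneck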
 
I have no experience with the game on PS4, but I was fine with the framerate on XB1 even before this patch. It runs very well for such a big game.
 

JimmyHoffa04

Neo Member
That the PS4 GPU has enough extra power compared to the Xbox One GPU to run the same game at 1080p at the same or better framerate than the Xbox One gets at 900p.

And this obviously is not the case. I think your perception of the power difference between the two consoles is a bit off. Case in point: the power difference (for Witcher 3) is somewhere between 1080p and 900p, and 20-30 fps and 25-30+ fps. Not all that different. And if you asked most people to tell you which is which (this is true for most games this gen), they couldn't do it. That's the difference.
 

nib95

Banned
If they get rid of the double buffering and 20fps cap with the PS4 version, I'd be more than happy with it. In all honesty the frame rate is better than most open world games were last gen, and if it weren't for just a few areas in the game, the frame rate would be acceptable. If it drops below 30fps in a more tangible way, just let it drop to whatever it needs to, instead of auto locking to an even lower frame rate. The latter method is baffling.
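For context on why that lower lock sits at exactly 20fps: with double-buffered vsync on a 60Hz display a frame can only be shown on a refresh boundary, so a frame that misses the two-refresh (33.3 ms) slot has to wait for the third. A rough sketch of that rounding, assuming a simple 60Hz double-buffered vsync model:

import math

REFRESH_MS = 1000.0 / 60.0  # one refresh interval on a 60Hz display

def displayed_fps(render_ms: float) -> float:
    # With double-buffered vsync, a finished frame waits for the next refresh boundary.
    refreshes = max(1, math.ceil(render_ms / REFRESH_MS))
    return 1000.0 / (refreshes * REFRESH_MS)

print(displayed_fps(32.0))  # fits within 2 refreshes -> 30.0 fps
print(displayed_fps(35.0))  # misses by ~1.7 ms, waits for the 3rd refresh -> 20.0 fps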
 
The PS4's GPU has 50% more ROPs. Enough for 1080p. What about the rest of the GPU resources? Not enough for a stable 30fps?

Are you being serious? Any arbitrary GPU spec number doesn't mean it "is enough for 1080p". It depends on so many factors in what the game is rendering.
 

ethomaz

Banned
The PS4's GPU has 50% more ROPs. Enough for 1080p. What about the rest of the GPU resources? Not enough for a stable 30fps?
Enough for 1080p? It seems like you are assuming that no resources other than ROPs are needed for 1080p.

Just something to think about... what if what is holding back the framerate at 1080p is the CPU, and the GPU is running below what it can do? Now swap CPU for every other component/resource found in a system.
 

Conduit

Banned
Enough for 1080p? It seems like you are assuming that no resources other than ROPs are needed for 1080p.

Just something to think about... what if what is holding back the framerate at 1080p is the CPU, and the GPU is running below what it can do? Now swap CPU for every other component/resource found in a system.

Whatever, guys! If CDPR fixes the framerate issues through patches on PS4 without touching any other graphical elements, well...
 
And this obviously is not the case. I think your perception of the power difference between the two consoles is a bit off. Case in point: the power difference (for Witcher 3) is somewhere between 1080p and 900p, and 20-30 fps and 25-30+ fps. Not all that different. And if you asked most people to tell you which is which (this is true for most games this gen), they couldn't do it. That's the difference.

It's not the case because the developer screwed it up, nothing else.
 
I just assume that because the Witcher 3 has a marketing deal with Microsoft they focused harder on the Xbox One version.

It always seems to be those games which are relatively better on Xbox One, and this one is no exception.
 

nib95

Banned
And this obviously is not the case. I think your perception of the power difference between the two consoles is a bit off. Case in point: the power difference (for Witcher 3) is somewhere between 1080p and 900p, and 20-30 fps and 25-30+ fps. Not all that different. And if you asked most people to tell you which is which (this is true for most games this gen), they couldn't do it. That's the difference.

This isn't really accurate if you're trying to gauge performance differences based solely on this title. The PS4 version auto caps to 20fps because of the buffering method. If that were removed, we don't really know what the lower end of the frame rate scale would look like with the PS4 version.
 
I just assume that because the Witcher 3 has a marketing deal with Microsoft they focused harder on the Xbox One version.

It always seems to be those games which are relatively better on Xbox One, and this one is no exception.

The other major example of this is Assassin's Creed Unity, which is even more ridiculous considering both PS4 and Xbox One run it at 900p and at the same framerate.
 

Seanspeed

Banned
Whatever, guys! If CDPR fixes the framerate issues through patches on PS4 without touching any other graphical elements, well...
Well what? Then your assumption would be correct. Doesn't change the fact that you are trying to pass off an assumption as fact.

It's not the case because the developer screwed it up, nothing else.
Everything we've just said has gone in one ear and out the other, hasn't it? Ugh.
 

RoadHazard

Gold Member
Enough for 1080p? It seems like you are assuming that no resources other than ROPs are needed for 1080p.

Just something to think about... what if what is holding back the framerate at 1080p is the CPU, and the GPU is running below what it can do? Now swap CPU for every other component/resource found in a system.

The CPU has nothing to do with shading pixels and pushing them to the screen.
 

nib95

Banned
I'm not one to tell someone how to develop a game, but the PS4 version would benefit from 900p... just my opinion. At least dynamic scaling.

Did you watch the video? The frame rate stays at 30fps the vast majority of the time. Just like with GTA V and Unity, I'm confident CDPR will keep improving the frame rate and performance without having to sacrifice the resolution.
 
Hell no. Higher bandwidth, a better GPU in every possible way, over 500 GFLOPS of difference, and just a 'tiny' 180p makes the difference in framerate in favor of the weaker console? That's hilarious!

Just a measly 180p difference. Come on son. What's your answer then? CDPR hates PS4? Or the nefarious MS paid them off?
 

Journey

Banned
All we hear about from developers now is how they want to make the best possible versions on each platform they release on.
And then we get a version that performs significantly worse on one platform which is significantly more powerful than the other.
I stand by my comment. If one version gets less attention to performance than the others, then that's laziness with regard to that platform. Otherwise it's incompetence.


Incompetence on their part, or ignorance on yours? I think the latter.

The PS4 version is running at 1920x1080. If they were both running at 900p and the Xbox One had better performance, you'd have a small point. TW3 is a pretty amazing game; you're forgetting all of the amazing content and the amazing design, structure and pacing, and you're simply focusing on a single aspect to call them lazy. A lazy effort in gathering facts before posting is more like it. Simply insulting to the hard work the devs put in.
 

THRILLH0

Banned
If they get rid of the double buffering and 20fps cap with the PS4 version, I'd be more than happy with it. In all honesty the frame rate is better than most open world games were last gen, and if it weren't for just a few areas in the game, the frame rate would be acceptable. If it drops below 30fps in a more tangible way, just let it drop to whatever it needs to, instead of auto locking to an even lower frame rate. The latter method is baffling.

Well, a frame rate that is capped at 30fps is preferable to one that fluctuates between 30-35 if Digital Foundry is to be believed, so shouldn't the same logic apply to framerates below 30?
 

RoadHazard

Gold Member
The CPU is a bottleneck for framerate in a lot of games, and I'm not saying it is for The Witcher 3.

Yes, but you brought it up specifically in the context of the game's framerate at 1080p, which a CPU bottleneck would have no impact on.

What about draw calls? Isn't that somewhat CPU dependent? Legit question. I know nothing, like Jon Snow.

Making a lot of draw calls can be CPU-bound, yes, but that still has nothing to do with how many pixels you are rendering. You don't make more draw calls at higher resolutions - you still have the same number of objects, materials, lighting effects, etc. - the GPU just has more pixels to shade every frame.
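To make that split concrete, here's a toy cost model with a per-draw-call CPU cost and a per-pixel GPU shading cost. Both constants are invented purely for illustration; the point is only which term resolution actually touches:

# Toy frame-cost model (constants are made up for illustration, not real numbers).
DRAW_CALL_CPU_US = 20.0      # CPU cost per draw call, in microseconds
SHADING_NS_PER_PIXEL = 12.0  # GPU shading cost per pixel, in nanoseconds

def frame_cost_ms(draw_calls: int, width: int, height: int) -> tuple[float, float]:
    cpu_ms = draw_calls * DRAW_CALL_CPU_US / 1000.0
    gpu_ms = width * height * SHADING_NS_PER_PIXEL / 1_000_000.0
    return cpu_ms, gpu_ms

# Same scene, same 2,000 draw calls, two resolutions:
print(frame_cost_ms(2000, 1600, 900))   # CPU term identical at both resolutions...
print(frame_cost_ms(2000, 1920, 1080))  # ...only the GPU term grows with pixel count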
 

tuxfool

Banned
Well, a frame rate that is capped at 30fps is preferable to one that fluctuates between 30-35 if Digital Foundry is to be believed, so shouldn't the same logic apply to framerates below 30?

Yes, of course. Consistency is usually the best option. A fluctuation above 30 implies there is headroom available to lock to 30. A fluctuation below it takes a lot more work (or cuts) to bring it up to a locked 30.
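Put in frame-time terms, which is how that headroom is usually reasoned about, the asymmetry looks like this (a quick sketch against the 33.3 ms budget of a 30fps target):

# Frame-time view of a 30fps target: the budget is 1000/30 ≈ 33.3 ms per frame.
BUDGET_MS = 1000.0 / 30.0

for fps in (35, 30, 25, 20):
    frame_ms = 1000.0 / fps
    slack_ms = BUDGET_MS - frame_ms
    print(f"{fps} fps -> {frame_ms:.1f} ms/frame, {slack_ms:+.1f} ms vs the 30fps budget")

# 35 fps leaves ~4.8 ms of spare time that a cap simply absorbs;
# 25 fps is ~6.7 ms over budget, which has to be clawed back with optimisation or cuts.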
 
Incompetence on their part, or ignorance on yours? I think the latter.

The PS4 version is running at 1920x1080. If they were both running at 900p and the Xbox One had better performance, you'd have a small point. TW3 is a pretty amazing game; you're forgetting all of the amazing content and the amazing design, structure and pacing, and you're simply focusing on a single aspect to call them lazy. A lazy effort in gathering facts before posting is more like it. Simply insulting to the hard work the devs put in.

As I posted earlier, the fact that they spent a lot of time and work on the size of the world/content/story doesn't mean that they spent enough time and resources getting all platforms performing well.
If one platform was neglected performance-wise, then that is indeed the fault of the developers.
I wonder how people would react if this game was developed in this state by a less beloved developer like, for example, Ubisoft. I bet a lot more of the blame would be directed at the devs if that were the case.
 

Journey

Banned
As I posted earlier, the fact that they spent a lot of time and work on the size of the world/content/story doesn't mean that they spent enough time and resources getting all platforms performing well.
If one platform was neglected performance-wise, then that is indeed the fault of the developers.
I wonder how people would react if this game was developed in this state by a less beloved developer like, for example, Ubisoft. I bet a lot more of the blame would be directed at the devs if that were the case.


That's a big IF. The fact is you have no real evidence that anything was neglected, and one could argue that they've achieved more than most expected from a platform with a mobile CPU and a 1.8 TF GPU.
 

Journey

Banned
First, 180p is a 44% difference... not tiny.

Second, GPU performance doesn't scale linearly.

Third, there are other things that can hold back performance too, like the CPU... if the CPU is a bottleneck then even a 100% more powerful GPU won't give you a good performance boost.

I wonder if these guys are reading these replies.

I bet all they do is this when common sense is spelled out for them:

[image]
 

Inuhanyou

Believes Dragon Quest is a franchise managed by Sony
The tears need to stop.

Just because many games run at a higher res and significantly higher FPS on PS4 doesn't mean they all are going to do that. Not all games are designed alike, or tailored to every console the same. Expecting that is just pure naivety.

Instead of name-calling CD Projekt RED with the same sort of conspiracy theories about marketing deals and whatever other nonsense, you should be hoping that they can improve performance to a stable 30 sometime soon.
 