I'm not mocking you. I only bring up the PC version because it's relevant to what you're talking about re: GPUs.
You're really oversimplifying things by comparing this to all other PS4 games. It's one of the most demanding games this gen so far, probably second only to AC:U. It's not as cut and dried as "well, Infamous runs at 1080/30, so this should too!" That doesn't really make sense.
Isn't it true that neither MS nor AMD had ever revealed what GPU is in the XB1 exactly? How do we know the difference is 40%? I mean, the GPU in the XB1 was custom made, correct?
MS did reveal the specs officially.
Its GPU is 40% more powerful. It also has to fill 44% more pixels than the XB1.
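For anyone who wants to sanity-check those pixel figures, the arithmetic is short enough to do in a couple of lines of plain Python (nothing console-specific assumed here):

```python
# Rough pixel-count comparison between the two render resolutions.
xb1 = 1600 * 900      # 1,440,000 pixels per frame at 900p
ps4 = 1920 * 1080     # 2,073,600 pixels per frame at 1080p

extra_pixels = ps4 - xb1        # 633,600 more pixels per frame
extra_ratio = ps4 / xb1 - 1     # ~0.44, i.e. ~44% more pixels

print(f"{extra_pixels:,} extra pixels per frame ({extra_ratio:.0%} more)")
```

So the 44% figure, and the "difference of 600,000 pixels" mentioned further down, both fall straight out of the two resolutions.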
None of this explains how the PS4 version's performance is worse than the performance of a significantly less powerful platform.
As has been pointed out many times to you already, the PS4 version has to render the game at 1080p, which, at 44% more pixels than 900p, uses a substantial amount of GPU power right there. Therefore there isn't always enough horsepower left to improve the framerate, too. Sometimes there is, sometimes there isn't. Sometimes it may even leave your performance worse off. It does not work the same with every game for various reasons.
At 1080p? Because I guess at 900p the game could run at 40% better framerate than Xbox One.
So what is your point?
That the PS4 GPU has enough extra power compared to the Xbox One GPU to run the same game at 1080p at the same or better framerate than the Xbox One runs at 900p.
Possibly, but you can't know that. Naming examples of other games doesn't work because games don't all work the same way.
Proof? Because from what I see in the GPU world (PC), 40% more horsepower doesn't mean 40% better performance... you need to see if there are other bottlenecks, like the CPU for example.
See, even here, you just can't use the math to come to hard figures like this.
Said that.
The difference in resolution is 44%, so even in the best case (ideal) scenario the PS4 game will run @1080p at 3-5 frames below what it runs @900p on Xbox One.
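To put a number on that best case: assuming the game is purely GPU-bound and performance scales linearly with both GPU power and pixel count (a big if, as other posters point out), the estimate looks like this, using the 40% figure from this thread and a 30fps baseline purely as stand-in numbers:

```python
# Naive best-case estimate: purely GPU-bound, perfectly linear scaling.
# The 40% power advantage and the 30 fps baseline are the figures used in
# this thread, not measured numbers.
xb1_fps = 30.0
power_advantage = 1.40                         # PS4 GPU assumed ~40% faster
pixel_ratio = (1920 * 1080) / (1600 * 900)     # ~1.44x more pixels at 1080p

ps4_fps = xb1_fps * power_advantage / pixel_ratio
print(f"Estimated PS4 fps at 1080p: {ps4_fps:.1f}")   # ~29.2 fps
```

Even under those generous assumptions the 1080p version comes out slightly behind; add any fixed per-frame costs or less-than-linear GPU scaling and the gap widens toward the 3-5 frames mentioned above.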
That's more than enough of a power advantage to run the game at least at the same framerate as the Xbox One, with the same effects/assets.
That's the point I have been making.
It depends what the load balance on the GPU is, and what part of the GPU is hardest hit. So in some situations a PS4 could do a better framerate with slightly better effects work. In other situations... it could have problems doing that @ 1080p. You also have certain flat costs that do not really scale linearly.
That is exactly what I said/think.
You can use them as rough guides, but that's about it. You're kind of making the same mistake he is, thinking that you can figure out exactly how much better a game should be based on numbers that are really only rough figures themselves (not every GPU is built or works the same, nor is 'power' any one specific metric).
1600x900 vs 1920x1080 is actually a difference of over 600,000 pixels needing to be rendered. Not actually that 'tiny' in terms of GPU demand.
Ah ok, my bad man.
The hypothetical example only works if you have 100% scaling across power... and that is proved false in so many ways: in some cases a 40% better GPU can run a game only 20% better, or another game 60% better... each case is a different case.
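A toy model makes the scaling point concrete. The millisecond figures below are invented purely for illustration: whenever part of the frame cost doesn't shrink with a faster GPU (CPU-side submission, sync points, fixed overhead), a 40% stronger GPU buys well under 40% more fps:

```python
# Illustrative only: split a frame into GPU-scalable work plus a fixed cost
# that a faster GPU does not touch. All timings are made up.
def fps(gpu_ms, fixed_ms, gpu_speedup=1.0):
    return 1000.0 / (gpu_ms / gpu_speedup + fixed_ms)

# Mostly GPU-limited frame: a 40% faster GPU gives roughly 35% more fps.
print(round(fps(30.0, 3.3), 1), "->", round(fps(30.0, 3.3, gpu_speedup=1.4), 1))

# A third of the frame is fixed cost: the same GPU gives only ~23% more fps.
print(round(fps(22.0, 11.3), 1), "->", round(fps(22.0, 11.3, gpu_speedup=1.4), 1))
```

Same 40% GPU uplift in both cases, but one "game" gains about 35% and the other about 23%, which is exactly the each-case-is-different point.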
PS4's GPU has 50% more ROPs. Enough for 1080p. What about the rest of the GPU resources? Not enough for stable 30fps?
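For what it's worth, raw ROP fill rate is rarely the thing that decides whether 1080p30 is reachable. Taking the commonly quoted figures (32 ROPs at 800MHz for PS4, 16 ROPs at 853MHz for Xbox One; treat these as assumptions rather than official confirmation), the theoretical fill rate dwarfs what a 1080p30 output stream needs:

```python
# Theoretical peak pixel fill rate vs. what a 1080p30 output actually needs.
# ROP counts and clocks are the commonly quoted figures, used as assumptions.
ps4_fill = 32 * 800e6        # ~25.6 Gpixels/s peak
xb1_fill = 16 * 853e6        # ~13.6 Gpixels/s peak

output_rate = 1920 * 1080 * 30   # ~62 Mpixels/s of final frame-buffer output
print(round(ps4_fill / output_rate), round(xb1_fill / output_rate))
```

Real frames touch pixels many times over (overdraw, transparencies, shadow maps, post-processing), so the practical margin is far smaller, which is the point made a couple of posts below: ROPs alone don't tell you whether a stable 30fps is achievable.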
And again:
We have one console which struggles at 1080p,
and another one which doesn't even try.
Enough for 1080p? Seems like you are assuming nothing is needed for 1080p except ROPs.
Just a bit of thinking for you... what if what is holding the framerate back at 1080p is the CPU, and the GPU is running below what it can? Now swap CPU for any other component/resource found in a system.
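Here is a one-liner version of that thought experiment, with made-up timings (and ignoring the pipelining and overlap a real engine has): the frame rate is set by whichever side finishes last, so extra GPU power does nothing once the CPU is the long pole.

```python
# Hypothetical frame timings in ms; the frame cannot present before both
# the CPU and the GPU side are done (real engines overlap this work).
def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

print(round(fps(cpu_ms=36.0, gpu_ms=33.0), 1))        # ~27.8 fps, CPU-bound
print(round(fps(cpu_ms=36.0, gpu_ms=33.0 / 1.4), 1))  # still ~27.8 fps with a 40% faster GPU
```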
And this obviously is not the case. I think your perception of the power difference between the two consoles is off a bit. Case in point: the power difference (for Witcher 3) is somewhere between 1080p and 900p, and between 20-30 fps and 25-30+ fps. Not all that different. And if you asked most people to tell you which is which (this is true for most games this gen), they couldn't do it. That's the difference.
I just assume that because the Witcher 3 has a marketing deal with Microsoft they focused harder on the Xbox One version.
It always seems to be those games which are relatively better on Xbox One, and this one is no exception.
Whatever, guys! If CDPR fix the framerate issues through patches on PS4 without touching any other graphical elements, well...
Well what? Then your assumption would be correct. Doesn't change the fact that you are trying to pass off an assumption as fact.
It's not that; it's due to the developer screwing it up, nothing else.
Everything we've just said has gone in one ear and out the other, hasn't it? Ugh.
I'm not one to tell someone how to develop a game, but PS4 would benefit from 900p... just my opinion. At least dynamic scaling.
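"Dynamic scaling" here just means nudging the render resolution up or down per frame based on how close the GPU is to its frame budget. A minimal sketch of the idea, with hypothetical thresholds and numbers (not anything CDPR has said they do):

```python
# Minimal dynamic-resolution controller sketch; all numbers are illustrative.
TARGET_MS = 33.3      # 30 fps frame budget

def adjust(scale, last_gpu_ms):
    if last_gpu_ms > TARGET_MS * 0.95:       # running hot: drop resolution a notch
        return max(0.75, scale - 0.05)       # floor of ~1440x810 in this sketch
    if last_gpu_ms < TARGET_MS * 0.80:       # plenty of headroom: creep back up
        return min(1.0, scale + 0.05)
    return scale

scale = 1.0
for gpu_ms in [30, 34, 36, 35, 31, 27, 25]:  # fake per-frame GPU times
    scale = adjust(scale, gpu_ms)
    print(f"{gpu_ms} ms -> render at {round(1920 * scale)}x{round(1080 * scale)}")
```

The appeal is that you only give up resolution in the scenes that actually need it, instead of locking the whole game to 900p.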
Hell no. Higher bandwidth, a better GPU in every possible way, over 500 GFLOPS of difference, and just a 'tiny' 180p makes the difference in framerate in favor of the weaker console? That's hilarious!
The CPU has nothing to do with shading pixels and pushing them to the screen.
All we hear about from developers now is how they want to make the best possible versions on each platform they release on.
And then we get a version that performs significantly worse on one platform which is significantly more powerful than the other.
I stand by my comment. If one version gets less attention to performance than the others, then that's laziness with regard to that platform. Otherwise it's incompetence.
Just a measly 180p difference. Come on son. What's your answer then? CDPR hates PS4? Or the nefarious MS paid them off?
If they get rid of the double buffering and 20fps cap with the PS4 version, I'd be more than happy with it. In all honesty the frame rate is better than most open world games were last gen, and if it weren't for just a few areas in the game, the frame rate would be acceptable. If it drops below 30fps in a more tangible way, just let it drop to whatever it needs to, instead of auto locking to an even lower frame rate. The latter method is baffling.
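For context on why a missed frame turns into 20fps rather than, say, 27: with double buffering and hard v-sync on a 60Hz display, a finished frame can only be shown on a vblank, so frame times snap to multiples of 16.7ms: 33.3ms (30fps), then straight to 50ms (20fps). A quick illustration with made-up frame times:

```python
import math

REFRESH_MS = 1000.0 / 60.0    # vblank interval on a 60 Hz display

def displayed_fps(render_ms):
    # With double buffering and hard v-sync, a finished frame waits for the
    # next vblank, so presentation snaps to multiples of 16.7 ms.
    intervals = max(1, math.ceil(render_ms / REFRESH_MS))
    return 1000.0 / (intervals * REFRESH_MS)

for ms in [30.0, 33.3, 34.0, 40.0, 49.0]:
    print(f"{ms:.1f} ms of work -> {displayed_fps(ms):.1f} fps on screen")
```

Triple buffering or an adaptive cap would let those 34-40ms frames come through at their natural 25-29fps instead of collapsing to a 20fps lock, which is essentially what the post above is asking for.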
The CPU is the bottleneck for framerate in a lot of games, and I'm not saying it is for The Witcher 3.
What about draw calls? Isn't that somewhat CPU dependent? Legit question. I know nothing, like John Snow.
Well, a framerate that is capped at 30fps is preferable to one that fluctuates between 30-35 if Digital Foundry is to be believed, so shouldn't the same logic apply to framerates below 30?
Incompetence on their part, or ignorance on yours? I think the latter.
The PS4 version is running at 1920x1080. If they were both running 900p and Xbox One had better performance, you'd have a small point. TW3 is a pretty amazing game; you're forgetting all of the amazing content and the amazing design, structure and pacing, and you're simply focusing on a single aspect to call them lazy. Lazy effort in gathering facts before posting is more like it. Simply insulting to the hard work the devs put in.
As I posted earlier, the fact that they spent a lot of time and work on the size of the world/content/story doesn't mean that they spent enough time and resources getting all platforms performing well.
If one platform was neglected performance-wise, then that is indeed the fault of the developers.
I wonder how people would react if this game was developed in this state by a less beloved developer, for example Ubisoft. I bet a lot more of the blame would be directed at the devs if that were the case.
First, 180p is a 44% difference... not tiny.
Second, GPU performance doesn't scale linearly.
Third, there are other things that can hold back performance too, like the CPU... if the CPU is a bottleneck then even a 100% more powerful GPU won't give you a good performance boost.