What evidence is there that Witcher 3, in the scenes where it drops below 30 FPS on console, is not constrained by one of the many factors which would be alleviated by dropping the resolution (memory bandwidth, fillrate, shader compute resources, ...)?
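For a rough sense of what a resolution drop actually buys: only the stages that scale with pixel count (fillrate, pixel shading, render-target bandwidth) get cheaper, while CPU, geometry and draw-call costs don't move. A quick back-of-the-envelope sketch in Python (the resolution targets are just the ones commonly thrown around in this thread, nothing CDPR has confirmed):

# Only pixel-bound work scales with resolution; if the stalls come from
# the CPU or from data streaming, a res drop buys almost nothing.
resolutions = {"1080p": (1920, 1080), "900p": (1600, 900), "720p": (1280, 720)}

base = resolutions["1080p"][0] * resolutions["1080p"][1]
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:>9,} px -> {pixels / base:.0%} of the 1080p pixel work")

So 900p is roughly a 30% cut in pure pixel work: a big lever if those scenes are fillrate- or bandwidth-bound, and worth nothing if they aren't.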
What amazes me is the attitude of the people saying it "should be 900p".
No, that is the easy way out for developers, and it could set a precedent where developers choose to lower the resolution instead of working on optimization.
Of course, the major reason for the average FPS differences between the versions right now is the fact that the PS4 apparently only double buffers its rendering.
I'll happily go on record to say that CDPR did a great job. I haven't seen anything comparable run comparably or better on these platforms. I'd also happily go on record to say that it would run better on faster hardware, but I don't need to, because we can all see that on PC.
You want to go on record and say CDPR has done a great job, and it's just because of the hardware?
Why would they opt to use double buffered rendering on the PS4 and not on the Xbox One? Does double buffering have any advantages over triple buffering besides slightly less memory consumption?
Other than cutscenes, the frame rate during gameplay is pretty much identical, and the PS4 runs at a higher resolution.
Obviously I won't say that it's "just" because of the hardware, because no complex software ever uses a modern hardware platform to the full extent theoretically possible.
The only advantage I could think of is that you automatically (without extra timing) get better framepacing -- as long as you don't drop any frames.
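To make that concrete, here is a toy model in Python of a 30 fps cap on a 60 Hz display (illustrative numbers only, not CDPR's actual presentation code). With two buffers the GPU has to wait for the swap before starting the next frame, so going over the 33.3 ms budget settles into a steady 50 ms cadence, i.e. a 20 fps lock; with three buffers the GPU starts the next frame immediately, which keeps the average higher but makes the flip intervals uneven:

import math

VSYNC = 1000.0 / 60.0  # ms between refreshes on a 60 Hz display

def simulate(frame_times_ms, buffers):
    """Return on-screen flip times for a vsync'd, 30 fps-capped game."""
    gpu_free = 0.0   # when the GPU may start rendering the next frame
    last_flip = 0.0  # when the previous frame hit the screen
    flips = []
    for ft in frame_times_ms:
        done = gpu_free + ft                     # rendering finishes here
        flip = max(done, last_flip + 2 * VSYNC)  # cap: >= 2 refreshes apart
        flip = math.ceil(flip / VSYNC - 1e-9) * VSYNC  # snap up to a vsync
        flips.append(flip)
        last_flip = flip
        # 2 buffers: the back buffer is busy until the swap, so the GPU waits;
        # 3 buffers: a spare buffer lets it start the next frame at once.
        gpu_free = flip if buffers == 2 else done
    return flips

for bufs in (2, 3):
    flips = simulate([40.0] * 10, bufs)  # GPU stuck ~7 ms over budget
    deltas = [round(b - a, 1) for a, b in zip(flips, flips[1:])]
    print(f"{bufs} buffers -> flip intervals (ms): {deltas}")

On these numbers the double-buffered run flips at a steady 50 ms (locked 20 fps), while the triple-buffered run averages around 25 fps but judders between 33.3 and 50 ms flips -- which is the framepacing trade-off in a nutshell.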
Is it CGI cutscenes or in-game conversations?
CDPR are doing the best they can with the hardware at their disposal.
Maybe they needed to show more demanding areas, because these sub-30fps drops during gameplay are not big at all. It held at around 29fps the majority of the time.
They appear to be in-engine cutscenes or whatever and they are not interactive.
Somehow resolution being the problem has become the go-to suggestion. Weird, when we have evidence that it is not the cause of the frame rate and other technical issues, as you suggested. Add Dying Light to the list.
Look at this texture/texture LOD travesty to the right. PS4 side.
Now look at the same texture on the building further down; observe the texture to the top left on the adjacent building. PS4 side.
Where is the guy on the podium? PS4 side.
Where is he, still further down that alley..smh. PS4 side.
Oh, here is the red suit guy...finally. PS4 side, what's up with the fire though?
Loading issues seem to affect the PS4 more; I see the PS4 losing the odd frame a bit more just riding on horseback as it loads new areas. This is definitely bad optimization on CDPR's part. So many issues with this game; it's really a bad conversion.
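Hitches while riding into new areas are the classic symptom of asset loading sharing time with the render loop. The standard mitigation is to stream on a worker thread and let the main thread integrate finished assets under a strict per-frame budget; a generic sketch of the pattern in Python (all names here are made up, this is not CDPR's engine):

import queue
import threading
import time

load_requests = queue.Queue()
ready_assets = queue.Queue()

def streaming_worker():
    # Runs off the render thread: file I/O and decompression happen here,
    # so a slow read costs background time instead of a rendered frame.
    while True:
        name = load_requests.get()
        time.sleep(0.01)                    # stand-in for disk read + decode
        ready_assets.put((name, f"<decoded {name}>"))

threading.Thread(target=streaming_worker, daemon=True).start()

INTEGRATE_BUDGET_MS = 2.0  # main-thread time allowed per frame for uploads

def integrate_ready_assets(world):
    # Called once per frame: drain finished loads, but stop once the budget
    # is spent so asset integration never blows the 33 ms frame.
    start = time.perf_counter()
    while (time.perf_counter() - start) * 1000.0 < INTEGRATE_BUDGET_MS:
        try:
            name, data = ready_assets.get_nowait()
        except queue.Empty:
            break
        world[name] = data

When that budget is missing, or the I/O happens on the main thread, every new area costs whole frames -- which is exactly the odd lost frame on horseback described above.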
Lowering the resolution can fix the problem, but it should really be a last resort, for when nothing else can fix the game.
They said Xbox One was easier to program for (for them), and now we witness the results.
Pretty sure they noted themselves that they found the XB1 easier to program for and optimize due to DirectX familiarity. IIRC, in the same interview they noted the PS4 had a better memory design from a programming point of view, but the XB1 was just an overall more familiar environment due to their PC roots.
Is there really that much difference though? I'm half thinking that comment was just lip-service BS 'cause they were paying the marketing bills.
In fact, isn't the PS4 inherently easier to program for, given there's no need to worry about some DDR3 hardware bottleneck?
Or maybe it's an API issue, since I'm guessing the Xbone is not too dissimilar from DX11 boilerplate.
I do not understand why it is double buffered. Why would they do that?
If it is double buffered, any drop at all should drop to 20 fps. They don't need to drop the res to get rid of that kind of drop.
That would be my guess. PC as lead platform, with XB1 development more straightforward than PS4 development due to DX.
It's not double-buffered, since DF's own videos show the framedrops don't make the framerate plummet all the way to 20fps.
It is. Gamersyde has a framerate analysis where it shows the game running consistently at 20fps in the swamp.
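For what it's worth, this dispute is decidable from the captures themselves: on a vsync'd 60 Hz game every frame persists for a whole number of refreshes, so frame times quantize to multiples of 16.7 ms, and a sustained run of exactly-50 ms frames (20 fps) with nothing in between is the signature of a double-buffered 30 fps cap missing its budget. A rough sketch of that kind of trace analysis in Python (the traces are hypothetical stand-ins, not DF's or Gamersyde's actual data):

from collections import Counter

VSYNC_MS = 1000.0 / 60.0

def vsync_histogram(frame_times_ms, tolerance=1.0):
    # Bucket each captured frame time by how many 60 Hz refreshes it spans.
    buckets = Counter()
    for t in frame_times_ms:
        n = round(t / VSYNC_MS)
        if abs(t - n * VSYNC_MS) <= tolerance:
            buckets[f"{n} refreshes ({n * VSYNC_MS:.1f} ms)"] += 1
        else:
            buckets["off-sync (torn or unlocked)"] += 1
    return dict(buckets)

# A swamp-style 20 fps lock vs. drops that never snap to 50 ms:
print(vsync_histogram([33.3] * 40 + [50.0] * 20))
print(vsync_histogram([33.3] * 40 + [41.0, 38.5]))

A locked section shows up as a solid block in the 3-refresh (50 ms) bucket; a game that tears or runs unlocked lands in the off-sync bucket instead, so the two claims above aren't necessarily incompatible if the behaviour differs between scenes.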
I think it's what people just need to learn to expect. It's like buying a mid-range car and expecting it to perform like a Ferrari, then blaming the company that makes the roads.
I could be wrong, but every time I read your posts it's just about how terribly weak the consoles are and how all the fault lies there, which is only partially true, because things like optimisation have always existed. There's no need to repeat that consoles are weak ad infinitum. In any case, the game runs quite well on PS4; they just need to fix the parts locked at 20 fps, which are really annoying. They could lower the LOD in such scenarios and use soft vsync. No one will die over some tearing, as long as we're not talking about the same level as the first Uncharted.
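The "soft vsync" being suggested is a simple policy: stay synced while the frame makes its 33.3 ms budget, and when it is already late, present immediately and accept one torn frame instead of stalling to the next refresh. A minimal sketch of the decision in Python (the present modes are hypothetical names; the real switch lives at the swap-chain level):

VSYNC_MS = 1000.0 / 60.0
FRAME_BUDGET_MS = 2 * VSYNC_MS  # 30 fps target on a 60 Hz display

def choose_present_mode(frame_time_ms):
    # Hard vsync turns a 34 ms frame into a 50 ms one (an instant 20 fps);
    # presenting immediately shows a tear line for one frame instead.
    if frame_time_ms <= FRAME_BUDGET_MS:
        return "present_on_vsync"     # clean image, steady 33.3 ms pacing
    return "present_immediately"      # torn frame, but no extra stall

for ft in (30.0, 33.0, 36.0, 45.0):
    print(f"{ft:.0f} ms frame -> {choose_present_mode(ft)}")

That is the trade-off the post above is weighing: a brief tear line versus the 20 fps hitching.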
Exactly, this is just a bad port to the PS4. More and more it seems that the PS4 didn't get the attention it deserves. If this game were properly optimized on the PS4, I could see enhancements to textures, load times and LOD without any need for a resolution drop. It is clear as day, based on all the discrepancies shown, that this game was not ready to release on consoles (especially the PS4). I see nothing in Witcher 3 where 1080p/30fps cannot be a locked and stable affair at all times, even in the swamps, but it seems many games are just thrown out to retail regardless these days.
DF mentioned it's double buffered indeed. It just happened in a cutscene for them, but it seems that when the fps drops too far, the PS4 stays locked to 20 even in gameplay. Of course, I'm not expecting miracles, but unlocking the fps could make the stuttering less jarring.
Lazy devs at it again. smh
I think they both do great, but PS4 users need to not lash out, and should allow devs to drop that 1080p requirement.
Just saw the GS video Seanspeed mentioned. Shit's awful. Makes me feel better about my PC purchase last year.
This is so insulting. Nothing about a game like TW3 is lazy.
It's just easier with more powerful hardware to have better performance. That's it. No laziness here.
If one version has gotten less attention in terms of optimization than the others, then it absolutely applies.
Otherwise it's just incompetence.
So..
@81chrisso @witchergame @MilezZx the camera stutter is getting a fix in the next patch.
https://twitter.com/Marcin360/status/604905169768300544
Did you read about the framerate? Hope they fix it in the next console update. As a PS4 owner, it sucks big time. Also that XP bug... these games must be played months after release in order to enjoy them fully.
Yes. But are they referring to the PS4 version at all? What about the data streaming issues on the PS4?
In all honesty, I don't get this tech choice. Even on the X360 there is tearing; why they preferred such a terribly stuttering framerate over a soft vsync solution is beyond me.
That is neither laziness, nor incompetence.
People here act like there is no priority in things we do. I'm sure with infinite resources and time all these things would be hammered out in one shot, but that is not how reality works.