Where are you getting 8xAF from? The texture filtering looks pretty bad to me.
Oh lawd... That looks like 2x.
Here
Comparison shots reveal very poor AF on both consoles (8x on PS4, 4x on Xbox One).
That is not optimization lol
Optimization is when you take software and make it run better without any sacrifices... there is a lot of that in gaming, and in any other software... just a change in the code can make some part run way better than the old code did.
That is optimization... the hardware and software quality won't change...
There is no magic... it is hard work and time.
You're right. Optimization involves magic and metal.
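For what it's worth, here is a minimal sketch of the kind of change being described: same output, cheaper code. It's a hypothetical Python example, not from any actual game, with made-up function names.

```python
# Hypothetical example: filter a list of entity ids. Both functions
# return exactly the same result; only the cost changes.

def visible_ids_slow(all_ids, culled_ids):
    # list membership is O(n) per check -> O(n*m) overall
    return [i for i in all_ids if i not in culled_ids]

def visible_ids_fast(all_ids, culled_ids):
    # build a set once, then each check is O(1) -> O(n+m) overall
    culled = set(culled_ids)
    return [i for i in all_ids if i not in culled]

if __name__ == "__main__":
    ids = list(range(5000))
    culled = list(range(0, 5000, 2))
    # identical output from cheaper code: that's the "no sacrifices" part
    assert visible_ids_slow(ids, culled) == visible_ids_fast(ids, culled)
```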
Could be dynamic though. That shot looks way worse than most of what I've seen while playing.
Really sad state of affairs for the console versions. It's like we're already back where we were in 2012 or so, with the 360/PS3 versions of games in relation to PC versions on moderate hardware.
Such a disappointing generation thus far outside of Bloodborne (which also has terrible performance issues) for me.
Cap to 20FPS what the hell are they thinking here?
The cutscenes run like shit for me on PC too. Same thing with DA:I. Game runs and looks great, then framerate goes to shit as soon as a cutscene starts.
They were thinking: you console peasants deserve nothing more than 20fps. Don't buy our game; we only spent a few years working on it. It's just a massive open-world game; you are entitled to trash it when cutscenes drop to lower framerates and gameplay is mostly at 30 FPS.
I find that extremely disrespectful to CDPR. They delivered a masterpiece.
It's not an explicit cap so much as just double-buffered vsync being double-buffered vsync. The plus side is that when it's dropping like this it's displaying each frame for an even 50ms; although the framerate is very low, it's not juddery per se. The sixth gen actually convinced me it's not a bad option for vsync'd 30fps games; I'd very often rather drop to 20fps for a second or two than deal with something flopping about in the mid-20s.
My thoughts exactly. Will not buy are my other thoughts.
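For anyone wondering why the dips land on exactly 20fps: with double-buffered vsync on a 60Hz display, a finished frame can only be swapped on a vblank, so its on-screen time rounds up to a multiple of roughly 16.7ms. A quick sketch of that arithmetic, with made-up render times:

```python
import math

REFRESH_HZ = 60
VBLANK_MS = 1000.0 / REFRESH_HZ  # ~16.7 ms between vblanks

def displayed_fps(render_ms):
    # the frame waits for the first vblank after rendering finishes
    intervals = math.ceil(render_ms / VBLANK_MS)
    return REFRESH_HZ / intervals

for render_ms in (15.0, 20.0, 35.0):
    print(f"{render_ms} ms render -> {displayed_fps(render_ms):.0f} fps")
# 15 ms -> 60 fps, 20 ms -> 30 fps, 35 ms -> 20 fps (each frame held 50 ms)
```

Miss the 33.3ms budget by even a millisecond and the frame is held for a third vblank; that's the even 50ms cadence described above.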
Game developers aren't interested in doing any of that.
Imagine a highly skilled developer from CDPR reading this...
Why lock the dips to 20 fps? I've never heard of that being done before.
Not true. These are hard-working folk who face serious crunch hours near a game's launch. If you've worked on a game in any capacity, you'll realise how gruelling it can be.
Not really... nice hyperbole though.
If this is 8xAF then The Witcher 3's world is the correct answer.
In what world is 8xAF poor?
The game is at 30fps 99% of the time, so why sacrifice 30% of the pixels and picture clarity for a 1% increase in fps? (CDPR can iron out the issue via a patch.)
I would agree with your statement if the PS4 version were averaging 15fps, which it is not.
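The 30% figure is roughly right, for what it's worth, assuming 16:9 framebuffers at each height (the exact render resolutions are an assumption here):

```python
# Back-of-envelope pixel counts, assuming 16:9 at each height
res = {"1080p": 1920 * 1080, "900p": 1600 * 900, "720p": 1280 * 720}
for name, px in res.items():
    print(f"{name}: {px:,} px ({px / res['1080p']:.0%} of 1080p)")
# 1080p: 2,073,600 (100%) | 900p: 1,440,000 (69%) | 720p: 921,600 (44%)
```

So dropping to 900p sheds roughly 31% of the pixels, which is where the 30% figure comes from, and 720p sheds more than half.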
Going by that logic, why not have the PS4 @720p, master of versions!!
If the framerate goes up on XB1 when dropping the resolution, it should do it on PS4 too, and I'd gladly take a patch with a new graphics option to drop the resolution to 900p or even 720p to get a better framerate. 20fps is just horrible. :/
Yikes.
Yeah, no way this is 8xAF.
Well, PC has stuttering issues in cutscenes too, and if you take an equivalent PC, like an i5 with a 750 Ti, it performs in around the same ballpark as the PS4, so no, I'm not seeing it.
It really seems like it to me the more I see these Digital Foundry threads.
Not only do you have lower quality assets and poor image quality as the norm, but it seems like frequent dips below 30 fps are to be expected in most games. At least we're getting a lot of 1080p (or at least 900p) for most games, I suppose.
Force it to 16x in the Nvidia control panel.
lol
So the game looks better on one console than the other. But not really. Or maybe a little.
Stay away from this game, dude. Your dreams will run at 20 fps...
Not supporting games that run this terribly.
Guess I have to wait until I upgrade my PC to play this.
I'm impressed at how well the Xbox One runs Witcher. I wonder if it would benefit the PS4 to run it at 900p.
Well, why don't you just compare it to a $2K~3K machine? No point in going halfway there.
PC has stuttering issues in cutscenes? I've only played for about 4 hours, but I certainly haven't had any. Maybe it's because I'm running with a G-Sync monitor? I don't know. The game has performed amazingly for me so far.
Also, I didn't mean to compare it to "equivalent hardware," I was just talking about "moderate hardware." Pretty much a machine that would fall within the $800~$1,000 range.