Witcher 3 PS4 patch 1.50 adds Pro support out now

AndyB1974

Member
Some quick comparisons with v1.30. Playing on a 1080p monitor.

[Five sets of v1.30 vs v1.50 comparison screenshots]
 
I feel the game is already showing its age with the visuals, so it's nice to see it get this update. I'm especially curious how it's going to fare on the One X, though I don't expect much of a deviation from the Pro version.

Yeah I can agree with that. There are parts that look great and others not so much. Imagine if they had brought the visuals those early screenshots showed, oh my. But that was never going to happen.

Doesn't Blood and Wine still look damn good though? Never played it but the footage looked great.
 

adamsapple

Or is it just one of Phil's balls in my throat?
Some quick comparisons with v1.30. Playing on a 1080p monitor.

Thanks. Among other things... looking at the second set of screens, you can see the thin reeds/branches in the middle are completely smoothed out in the 1.50 shots, while they look like a jagged mess in the 1.30 ones.

edit: and the comparison you edited into the post.

Nice.
 

Kamina

Golden Boy
So compared to PC, what graphics level is the PS4 version at? Mid range?
Maybe i’ll double dip...

@HDR: does it even require any processing power?
 
- 60FPS wasn't realistic here and won't happen on the X1X either.
- Native 4K wasn't all that realistic either given the scope of the game but it might be possible on the X.
- No HDR is disappointing but the game naturally has fantastic art with vibrant colors so it's fine.

I don't know when we all agreed that 60 FPS / 4K on a game like The Witcher 3 was possible and expected, but we need to walk back our expectations.
 

Tovarisc

Member
Do the people asking for HDR / being disappointed that CDPR didn't implement it in TW3 even know what goes into adding it to an already finished, few-year-old game?

I sure as hell don't, so any info on how easy or hard the process would be is appreciated. I have seen comments here and there that you basically need to design your engine and content pipeline from the ground up for HDR support.
 
I feel like there's a difference in AO.

IIRC, like CA, TW3 scales all post-processing with resolution, so if they did not mess with the AO darkening and radius for the default SSAO, it could look "lessened" at a higher res.

I do not remember if that applies to HBAO+ though, probably. I remember Jim2Point complaining about it because it makes Ansel a bit useless on PC unless you want little to no post-processing (and AO is pretty necessary post-processing IMO).
 
Yeah I can agree with that. There are parts that look great and others not so much. Imagine if they had brought the visuals those early screenshots showed, oh my. But that was never going to happen.

Doesn't Blood and Wine still look damn good though? Never played it but the footage looked great.
I think BaW does look generally better than the rest of the game, but not to an obvious extent.
 

Tinúviel

Member
I finished the main game on my PC at release. Even though I have a GTX 1080, I bought the GOTY edition for PS4 this summer, just because I can't stop using the debug menu on the PC version. -_-

I left off Blood and Wine right at this point of the game because of the patch news.


I'm using a 1440p monitor, which obviously means the Pro downsamples to 1080p. I can confirm that the difference is huge; the non-patched version's biggest issue for me was the lack of AF and AA. Now the IQ looks really clean.

I also bought an LG 27UD67-W; I'm sure it'll look stunning in 4K.
 

Kalentan

Member
I haven't played in a while but it looks really sharp on my 4K LG TV.

I'll probably actually finally beat it now.

Edit: Huh, I haven't played since patch 1.20
 

Md Ray

Member
So compared to PC, what graphics level is the PS4 version at? Mid range?
Maybe i'll double dip...

@HDR: does it even require any processing power?

Resolution: 1920x1080 (4K via cb or geo on Pro)
Nvidia HairWorks: Off
Number of Background Characters: Low
Shadow Quality: Medium
Terrain Quality: Medium
Water Quality: High
Grass Density: Medium
Texture Quality: Ultra
Foliage Visibility Range: High
Detail Level: Medium
Ambient Occlusion: SSAO
All post-process effects on, except vignetting
 

VeeP

Member
They said they would release it when it's done. What's interesting about it?

People probably thought Microsoft paid CDPR to hold it back until the X1X version was ready.

Anyway, I just have a base PS4, no Pro, so I won't be getting this update :( Enjoy, guys!
 
Resolution: 1920x1080 (4K via cb or geo on Pro)
Nvidia HairWorks: Off
Number of Background Characters: Low
Shadow Quality: Medium
Terrain Quality: Medium
Water Quality: High
Grass Density: Medium
Texture Quality: Ultra
Foliage Visibility Range: High
Detail Level: Medium
Ambient Occlusion: SSAO
All post-process effects on, except vignetting

Texture quality on PS4 is not Ultra, as that adds a mip LOD bias that is much "sharper" than the PS4 default. It would be somewhere in between Medium and High.
 

BigDug13

Member
Didn't the boost mode make some of those really rough FPS areas playable? Does this patch eliminate those bonuses? Couldn't care less about the visual upgrades if they've downgraded the frames per second to achieve it. Kinda focusing on the wrong thing here.
 

pswii60

Member
Is the framerate and overall jankiness improved? I only tried it at launch and the performance on PS4 was unbearable for me, although I appreciate it's been patched a few times since then.
 
Guys, I can confirm that we are getting 4K on the edges!

Thanks to Tinúviel and his perfect screenshots, I was able to pixel count and get a definite, beautiful, 2160p pixel count!

Here is the image if you want to count it yourself

[screenshot with the pixel-counting line marked]


My line is 30 pixels up

The math, for people unaware, looks like this:

Draw 30 pixels up from a pixel "staircase"

Skip a pixel up and across like I did in the photo

Continue drawing a line sideways, until you hit another staircase

Count each staircase between the sections you marked

In our case, it happens to be 30 staircases

Now you divide the staircases you counted by the number of pixels you drew upwards at the start, so:

30 / 30 = 1

Then you take that number and multiply it by the vertical resolution of the photo:

1 x 2160 = 2160
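
For anyone who wants to sanity-check the arithmetic, here's the same calculation as a quick Python sketch (the function and variable names are just mine for illustration, not from any pixel-counting tool):

```python
def estimate_vertical_resolution(pixels_traced, staircases_counted, image_height):
    """Estimate the rendered vertical resolution from a pixel-counted edge.

    The ratio of staircase steps to pixels traced along the edge approximates
    the ratio of the rendered resolution to the screenshot's resolution.
    """
    scale = staircases_counted / pixels_traced
    return scale * image_height

# The numbers from the post: 30 staircases over a 30-pixel run on a 2160p screenshot.
print(estimate_vertical_resolution(30, 30, 2160))  # -> 2160.0
```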
 

adamsapple

Or is it just one of Phil's balls in my throat?
Didn't the boost mode make some of those really rough FPS areas playable? Does this patch eliminate those bonuses? Couldn't care less about the visual upgrades if they've downgraded the frames per second to achieve it. Kinda focusing on the wrong thing here.

Even before boost mode, the GOTY version (and the base game with all the patches) was pretty stable. Some bog-area fights would have small drops, but by last year the game was a huge improvement over launch.

With this Pro patch, CDPR have noted there's a "slight boost to performance" so it's only gonna be better.

http://wccftech.com/the-witcher-3-ps4-pro/

Yes, the game has just received an upgrade patch, enabling it to take advantage of the additional power offered by the PS4 Pro. When playing the game on a PS4 Pro system, The Witcher 3: Wild Hunt and all its additional content feature support for 4K resolution and a slight boost to performance.
 

Md Ray

Member
Texture quality on PS4 is not ultra, as that adds in mip LOD bias that is much "sharper" than default PS4. It would be somewhere inbetween medium and high.

I thought High and Ultra textures were the same... From the GeForce guide:
"On Low, 1024x1024 textures are enabled, though there's also a double dose of Texture, Detail Texture and Atlas Texture downscaling, reducing VRAM usage at the expense of texture detail and clarity. Switching to Medium increases texture detail to 2048x2048, and reduces the downscaling to 1x, marginally improving clarity. On High, downscaling is dropped entirely, revealing max quality 2048x2048 textures, greatly improving image quality.

On Ultra, there are no further improvements to clarity or detail, merely an increase in the memory budget, allowing more textures to be stored in memory at any given time. Running around on foot this makes little difference, but when galloping on horseback it minimizes the chance of encountering the unsightly streaming-in of high-quality textures."


EDIT: You're right, there's a difference between High and Ultra. Just saw the "Update" below.
 
I thought High and Ultra textures were the same... From the GeForce guide:
"On Low, 1024x1024 textures are enabled, though there's also a double dose of Texture, Detail Texture and Atlas Texture downscaling, reducing VRAM usage at the expense of texture detail and clarity. Switching to Medium increases texture detail to 2048x2048, and reduces the downscaling to 1x, marginally improving clarity. On High, downscaling is dropped entirely, revealing max quality 2048x2048 textures, greatly improving image quality.

On Ultra, there are no further improvements to clarity or detail, merely an increase in the memory budget, allowing more textures to be stored in memory at any given time. Running around on foot this makes little difference, but when galloping on horseback it minimizes the chance of encountering the unsightly streaming-in of high-quality textures."
High and Ultra textures are definitely the same in terms of resolution when the camera is smack-dab right in front of the texture, but the texture setting also ties in with other variables, namely mipmap bias. From that same NV guide:
v1.04 introduces TextureMipBias, a new [Rendering] setting that is tied to Texture Quality. If you're interested, the ins and outs of Mipmaps can be discovered here. For everyone else, the cliff notes: mipmaps are lower-resolution versions of textures that are utilized to increase performance and minimize VRAM use. In the case of The Witcher 3: Wild Hunt, mipmaps are aggressively used for the innumerable layers of foliage that can be seen for miles, and for just about every other texture in the game. This improves performance and keeps VRAM requirements down, but results in some lower-quality foliage and surfaces in the immediate vicinity of Geralt.

By reducing the bias, as this new setting does, higher-resolution mipmaps are loaded at near-range, and the rate at which mipmaps are scaled down across distant views is reduced, increasing visible detail considerably. Unfortunately, this also increases aliasing and shimmering on fine detail, such as the mesh shoulder pads of Geralt's starting armor, and the many thin pieces of swaying grass. As the bias is further reduced these problems intensify, though they can be somewhat mitigated by increasing the rendering resolution and by downsampling. Also, make sure to disable sharpening filters, which greatly exacerbate these issues.

Below, we demonstrate the visual impact of TextureMipBias at "0", the v1.03 level, and the level used on Low and Medium Texture Quality settings in v1.04; at "-0.4", the value used for High; at "-1.0", the value used for Ultra; and at "-2.0", our tweaked value. Focus primarily on the left side of the screen, paying particular attention to the thatched roofs, the tree and foliage near to them, and the trees before the river (hold down ctrl and press + to enlarge these sections for a better view).
So on Ultra (or even High), textures further into the distance use a lower mip, and thus are higher in apparent resolution. This can of course cause problems though, and add extra aliasing if the texel density is much higher than the pixel density, or if the texture has lots of specularity / is an alpha texture with holes that end up smaller than a pixel in size.
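
As a rough illustration of what that bias does (this is just textbook mip LOD selection using the values from the NV guide, not CDPR's actual engine code):

```python
import math

def select_mip_level(texels_per_pixel, mip_bias, num_mips):
    """Pick a mip level from a pixel's texel footprint.

    Level 0 is the full-resolution texture; each +1 halves it. A negative
    bias (High uses -0.4, Ultra -1.0 per the guide) pushes the choice toward
    sharper mips, at the cost of extra aliasing and shimmer on fine detail.
    """
    lod = math.log2(max(texels_per_pixel, 1e-6)) + mip_bias
    return min(max(round(lod), 0), num_mips - 1)

# A distant surface covering ~4 texels per pixel, on a texture with 12 mips:
print(select_mip_level(4.0, 0.0, 12))   # 2 -> Low/Medium bias
print(select_mip_level(4.0, -1.0, 12))  # 1 -> Ultra bias: one mip sharper
```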
 
Didn't the boost mode make some of those really rough FPS areas playable? Does this patch eliminate those bonuses? Couldn't care less about the visual upgrades if they've downgraded the frames per second to achieve it. Kinda focusing on the wrong thing here.
Boost mode didn't unlock the full GPU performance in games without a Pro patch. It basically ran the same 18 GCN compute units at 911 MHz instead of the 800 MHz the original PS4 could achieve; the other half of the GPU stays deactivated for stability purposes. The CPU does run at full speed though.
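
For a back-of-the-envelope sense of what that clock bump is worth (a sketch using the publicly known CU counts and clock speeds, nothing official):

```python
def gcn_fp32_tflops(compute_units, clock_mhz, alus_per_cu=64, ops_per_clock=2):
    """Theoretical FP32 throughput of a GCN GPU in TFLOPS (CUs x ALUs x FMA x clock)."""
    return compute_units * alus_per_cu * ops_per_clock * clock_mhz * 1e6 / 1e12

print(gcn_fp32_tflops(18, 800))  # ~1.84 TFLOPS - base PS4
print(gcn_fp32_tflops(18, 911))  # ~2.10 TFLOPS - boost mode (half the Pro GPU, higher clock)
print(gcn_fp32_tflops(36, 911))  # ~4.20 TFLOPS - full Pro GPU with a proper Pro patch
```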
 

Blackthorn

"hello?" "this is vagina"
Didn't the boost mode make some of those really rough FPS areas playable? Does this patch eliminate those bonuses? Couldn't care less about the visual upgrades if they've downgraded the frames per second to achieve it. Kinda focusing on the wrong thing here.
Performance on PS4 had actually been mostly resolved pre-Pro thanks to patches.
 

nOoblet16

Member
I thought High and Ultra textures were the same... From the GeForce guide:
"On Low, 1024x1024 textures are enabled, though there's also a double dose of Texture, Detail Texture and Atlas Texture downscaling, reducing VRAM usage at the expense of texture detail and clarity. Switching to Medium increases texture detail to 2048x2048, and reduces the downscaling to 1x, marginally improving clarity. On High, downscaling is dropped entirely, revealing max quality 2048x2048 textures, greatly improving image quality.

On Ultra, there are no further improvements to clarity or detail, merely an increase in the memory budget, allowing more textures to be stored in memory at any given time. Running around on foot this makes little difference, but when galloping on horseback it minimizes the chance of encountering the unsightly streaming-in of high-quality textures."

So basically, on Medium, textures are max resolution up close but use mip maps for textures at a distance, whereas High gets rid of mipmapping and it's max resolution all the time, and on Ultra the game stores additional texture data in order to avoid streaming transitions?
 

f@luS

More than a member.
I stopped playing when it was an unstable 30 FPS on the base PS4; I was like, it's so good looking, but damn that frame rate. After that, too many other games to play.

Now I have a stable 30 FPS with increased resolution?! All I asked for. Bless this patch. Will play soon.
 

Schlomo

Member
So basically, on Medium, textures are max resolution up close but use mip maps for textures at a distance, whereas High gets rid of mipmapping and it's max resolution all the time, and on Ultra the game stores additional texture data in order to avoid streaming transitions?

It only changes the bias. Getting rid of mip-mapping would result in shimmering everywhere except close to the camera.
 

coastel

Member
So compared to PC, what graphics level is the PS4 version at? Mid range?
Maybe i’ll double dip...

@HDR: does it even require any processing power?

I'd like to know this, as I tried the PC version at 4K with medium settings on a GTX 1060 6GB. It looked great and ran fine... in the starting area. I'm going to see what it looks like on the Pro now with this patch.
 
It's not just me, then; there are definite performance drops now where there weren't before. On boost mode it was pretty much a 30 FPS lock. Whilst it looks much better now, it doesn't hold its frame rate nearly as well, I find.
 
So compared to PC, what graphics level is the PS4 version at? Mid range?
Maybe i’ll double dip...

@HDR: does it even require any processing power?

The HDR power requirement would be minimal, and I think the PS4 Pro might even have hardware dedicated to it, but it's probably much harder to add than simply upping the resolution and telling the game it's OK to use all the extra horsepower now. Someone would have to go back and basically re-tweak the lighting for the entire game.
 

Md Ray

Member
Resolution: 1920x1080 (4K via cb or geo on Pro)
Nvidia HairWorks: Off
Number of Background Characters: Low
Shadow Quality: Medium
Terrain Quality: Medium
Water Quality: High
Grass Density: Medium
Texture Quality: High
Foliage Visibility Range: High
Detail Level: Medium
Ambient Occlusion: SSAO
All post-process effects on, except vignetting

I'd like to know this, as I tried the PC version at 4K with medium settings on a GTX 1060 6GB. It looked great and ran fine... in the starting area. I'm going to see what it looks like on the Pro now with this patch.

^As close as possible to PS4 equivalent settings.
 
Guys, I can confirm that we are getting 4K on the edges!

Thanks to Tinúviel and his perfect screenshots, I was able to pixel count and get a definite, beautiful, 2160p pixel count!

My line is 30 pixels up

The math, for people unaware, looks like this:

Draw 30 pixels up from a pixel "staircase"

Skip a pixel up and across like I did in the photo

Continue drawing a line sideways, until you hit another staircase

Count each staircase between the sections you marked

In our case, it happens to be 30 staircases

Now you divide the staircases you counted by the number of pixels you drew upwards at the start, so:

30 / 30 = 1

Then you take that number and multiply it by the vertical resolution of the photo:

1 x 2160 = 2160

Great work. But how is this possible when Dragon's Dogma can't hit 4K? Ouch.
 
Thanks, I thought it was just me being oversensitive. I can't ever prioritise image quality over a stable frame rate, so this patch isn't good for me.
 