
Digital Foundry - Horizon Forbidden West PC vs PS5: Enhanced Features, Performance Tests + Image Quality Boosts!

Bojji

Member
It genuinely doesn't to me, and I've just watched it on an 85" panel. I expected a bigger leap, tbh. However, if you have a top-tier GPU and can run it natively at 4K with high frame rates, that would be great.

You don't need a high-end GPU to get better image quality than the PS5 version's 60 FPS mode; something mid-tier will be enough.

Even 1800p with DLSS Performance (internal 900p) produces better results than the PS5 version's upscaling (dynamic 1800p checkerboard).
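(For anyone wanting the arithmetic: DLSS scales each axis by a fixed, published factor per mode, so the internal resolutions fall straight out. A quick illustrative sketch using those standard ratios; the helper is made up for this post, not anything from the game:)

```cpp
#include <cstdio>

// NVIDIA's published per-axis DLSS scale factors. The game's DRS can go
// lower than the selected mode, down to the Performance-mode floor.
struct DlssMode { const char* name; double scale; };

int main() {
    const DlssMode modes[] = {
        {"Quality",           0.667},
        {"Balanced",          0.580},
        {"Performance",       0.500},
        {"Ultra Performance", 0.333},
    };
    const int outputs[][2] = {{3200, 1800}, {2560, 1440}}; // 1800p, 1440p

    for (const auto& out : outputs) {
        std::printf("Output %dx%d:\n", out[0], out[1]);
        for (const auto& m : modes)
            std::printf("  %-17s -> internal %.0fx%.0f\n",
                        m.name, out[0] * m.scale, out[1] * m.scale);
    }
    return 0;
}
```

So 1800p output in Performance mode renders internally at 1600x900, which is where the "internal 900p" above comes from.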
 

Gaiff

SBI’s Resident Gaslighter
Jeez, these threads always turn to shit.

Both sides suck. PS5 players get too defensive and the PC gang acts condescending as always.

Can't we do better and just be happy and play fucking games?
No, this isn’t a "both sides are wrong" problem. The PS guys are clearly being a bunch of assholes here. We’re trying to engage in a normal discussion and they turn it into platform-warring nonsense.
 

adamsapple

Or is it just one of Phil's balls in my throat?
Can't we do better and just be happy and play fucking games?

[Will Smith reaction GIF]



I'm confused... I thought DirectStorage was GPU decompression?

Via DF

This is aided by DirectStorage, though GPU decompression isn't enabled. At present, the developers feel this feature is too restrictive in terms of mixing and matching decompression formats and isn't quite the finished article. We're told that as Horizon isn't heavily CPU-limited, it makes sense to run decompression on the CPU rather than adding additional burden to the GPU.

Either way, the frame-time issues we saw with GPU-based DirectStorage in Ratchet and Clank are not visible here.
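(For the confused: DirectStorage is Microsoft's storage API for fast asset streaming, and whether decompression runs on the CPU or the GPU falls out of each request's destination type rather than a global switch. A minimal sketch against the public dstorage.h interface; the file name, sizes, and buffers are made up, and this is not Nixxes's actual code:)

```cpp
#include <dstorage.h>
#include <d3d12.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

// Loads one GDeflate-compressed blob. The CPU-vs-GPU decompression choice
// is made per request via the destination type.
void LoadAsset(IDStorageFactory* factory, ID3D12Device* device,
               void* cpuDst, ID3D12Resource* gpuDst,
               UINT64 compressedSize, UINT64 uncompressedSize,
               bool useGpuDecompression)
{
    ComPtr<IDStorageFile> file;
    factory->OpenFile(L"assets/blob.bin", IID_PPV_ARGS(&file)); // hypothetical path

    DSTORAGE_QUEUE_DESC queueDesc{};
    queueDesc.SourceType = DSTORAGE_REQUEST_SOURCE_FILE;
    queueDesc.Capacity   = DSTORAGE_MAX_QUEUE_CAPACITY;
    queueDesc.Priority   = DSTORAGE_PRIORITY_NORMAL;
    queueDesc.Device     = device; // required for GPU (buffer) destinations

    ComPtr<IDStorageQueue> queue;
    factory->CreateQueue(&queueDesc, IID_PPV_ARGS(&queue));

    DSTORAGE_REQUEST req{};
    req.Options.SourceType        = DSTORAGE_REQUEST_SOURCE_FILE;
    req.Options.CompressionFormat = DSTORAGE_COMPRESSION_FORMAT_GDEFLATE;
    req.Source.File.Source        = file.Get();
    req.Source.File.Offset        = 0;
    req.Source.File.Size          = static_cast<UINT32>(compressedSize);
    req.UncompressedSize          = static_cast<UINT32>(uncompressedSize);

    if (useGpuDecompression) {
        // GPU path: the runtime inflates GDeflate on the GPU straight into
        // a D3D12 buffer.
        req.Options.DestinationType     = DSTORAGE_REQUEST_DESTINATION_BUFFER;
        req.Destination.Buffer.Resource = gpuDst;
        req.Destination.Buffer.Offset   = 0;
        req.Destination.Buffer.Size     = static_cast<UINT32>(uncompressedSize);
    } else {
        // CPU path: a system-memory destination makes the runtime decompress
        // on CPU worker threads instead of the GPU.
        req.Options.DestinationType   = DSTORAGE_REQUEST_DESTINATION_MEMORY;
        req.Destination.Memory.Buffer = cpuDst;
        req.Destination.Memory.Size   = static_cast<UINT32>(uncompressedSize);
    }

    queue->EnqueueRequest(&req);
    queue->Submit(); // completion tracked elsewhere via a status array or fence
}
```

Point being, a game can keep DirectStorage for its fast I/O path while skipping the GPU decompressor entirely, which is what DF describes here.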
 

OverHeat

« generous god »
No, this isn’t a "both sides are wrong" problem. The PS guys are clearly being a bunch of assholes here. We’re trying to engage in a normal discussion and they turn it into platform-warring nonsense.
I see the ban hammer falling on some lol
 

CGNoire

Member
[Will Smith reaction GIF]





Via DF

This is aided by DirectStorage, though GPU decompression isn't enabled. At present, the developers feel this feature is too restrictive in terms of mixing and matching decompression formats and isn't quite the finished article. We're told that as Horizon isn't heavily CPU-limited, it makes sense to run decompression on the CPU rather than adding additional burden to the GPU.

Either way, the frame-time issues we saw with GPU-based DirectStorage in Ratchet and Clank are not visible here.
I guess I'm still lost about what exactly DirectStorage is, then?
 

Fabieter

Member
Jeez, these threads always turn to shit.

Both sides suck. PS5 players get too defensive and the PC gang acts condescending as always.

Can't we do better and just be happy and play fucking games?

And both sides will tell you they are in the right, like narcissistic people who can't admit a mistake.
 

rodrigolfp

Haptic Gamepads 4 Life
Jeez, these threads always turn to shit.

Both sides suck. PS5 players get too defensive and the PC gang acts condescending as always.

Can't we do better and just be happy and play fucking games?
People want to have the best possible experience with a game. Is that shocking?
 

King Dazzar

Member
You don't need a high-end GPU to get better image quality than the PS5 version's 60 FPS mode; something mid-tier will be enough.

Even 1800p with DLSS Performance (internal 900p) produces better results than the PS5 version's upscaling (dynamic 1800p checkerboard).
Well, it doesn't look "far superior" in the comparison shots. But I'm sure on the higher end, running at 4K with very high framerates, it would be a more noticeable step up. Or the YouTube video isn't doing the differences justice.
 

Bojji

Member
Well, it doesn't look "far superior" in the comparison shots. But I'm sure on the higher end, running at 4K with very high framerates, it would be a more noticeable step up. Or the YouTube video isn't doing the differences justice.

I see a big difference in image stability with DLSS; the PS5 version is fizzling on some details.
 

bitbydeath

Member
Alex states in the video and I state in the OP that the graphics on PC aren’t much better. Yet you have the usual trolls shitting up the thread.
It’s the old case of blood in the water.
People get riled up and defensive, so people naturally start to tease them.

I don’t know why people can’t just enjoy tech, no matter the side.
 

rodrigolfp

Haptic Gamepads 4 Life
It’s the old case of blood in the water.
People get riled up and defensive, so people naturally start to tease them.

I don’t know why people can’t just enjoy tech, no matter the side.
Defensive how? The game is better and cheaper on PC. What has to be defended?
 
Not that big of a difference on PC. I did double dip with HZD, but since I played HFW on PS5, I don't think I'll double dip again. Maybe when it's super cheap and the third one is on the horizon (no pun intended), I'll replay it.
 

ChiefDada

Gold Member
"Ryzen 5 3600 is CPU limited above 60fps"

Soooo.... can we stop saying this CPU is universally stronger than PS5 in real world terms NOT SPEC SHEET.

"1800p DLSS w/ DRS lower bounds of 900p is needed to get a 'basically locked' 60fps on a 3070.

Soooo can we stop saying the PS5 Pro will perform like a 3070 "at best" (you know who you are)?
 

Gaiff

SBI’s Resident Gaslighter
"Ryzen 5 3600 is CPU limited above 60fps"

Soooo.... can we stop saying this CPU is universally stronger than PS5 in real world terms NOT SPEC SHEET.

"1800p DLSS w/ DRS lower bounds of 900p is needed to get a 'basically locked' 60fps on a 3070.

Soooo can we stop saying the PS5 Pro will perform like a 3070 "at best" (you know who you are)?
Did he mention the lower bound on the 3070? I don’t recall that.
 

ChiefDada

Gold Member
Did he mention the lower bound on the 3070? I don’t recall that.

He said all of the DLSS modes have lower bounds equal to the Performance-mode internal render, which would be 720p for the 2070S (1440p DLSS) and 900p for the 3070 (1800p DLSS).
 

yamaci17

Member
Did he mention the lower bound on the 3070? I don’t recall that.
I'm sure this game is also tanking performance on the 3070 a lot due to VRAM overflow. It is quite a shame, really. I ran into similar extreme GPU bottlenecks in Ratchet and had to cover up the VRAM performance tank with more aggressive internal resolutions, even at 1440p.

I will be able to test it tomorrow, hopefully. I'm sure it is another case of this:



Think of it like this: to cover up the performance loss due to very high textures, you would probably reduce the resolution to 1440p in the above example...
 

ChiefDada

Gold Member
I'm sure this game is also tanking performance on the 3070 a lot due to VRAM overflow. It is quite a shame, really. I ran into similar extreme GPU bottlenecks in Ratchet and had to cover up the VRAM performance tank with more aggressive internal resolutions, even at 1440p.

I will be able to test it tomorrow, hopefully. I'm sure it is another case of this:



Strictly talking about GPU drops where GPU load is maxed out, as Alex mentioned. I was sure to pass on bringing up memory-bound drops, as we have gone to war way too many times over this lol.
 

Gaiff

SBI’s Resident Gaslighter
He said all of the DLSS modes have lower bounds equal to the Performance-mode internal render, which would be 720p for the 2070S (1440p DLSS) and 900p for the 3070 (1800p DLSS).
He said it can potentially drop to that level, not that it does. That’s the res floor; it doesn’t mean it actually ever hits it. It might.

He’s got another video with optimized settings, so maybe he’ll do a count there. Whatever the case, 1800p is apparently a bit too much for the 3070's VRAM at times.
 

yamaci17

Member
Strictly talking about GPU drops where GPU load is maxed out, as Alex mentioned. I was sure to pass on bringing up memory-bound drops, as we have gone to war way too many times over this lol.
Memory-bound drops also manifest with max GPU load. The only way to properly identify them is to look at shared VRAM usage, which Alex refuses to do for some reason.

That's why the 3070 also performs far below the PS5 at equivalent settings/resolutions: it is offloading 4.5 GB worth of memory to regular RAM. This is something PC games should not do; they should do proper texture streaming instead. In most other cases, even breaching past 1 GB of shared memory will cause insane performance problems, and you stop getting your GPU's worth of performance at that point.

[screenshots: shared VRAM usage]



If a game uses more than 1-1.5 GB of shared memory, you're not getting the optimal GPU performance you could get. Simple as that. Alan Wake 2, Avatar, and most UE5 titles properly stream textures and never go above 1.5 GB of shared memory, and as such, in those games these GPUs perform like you would expect.

This war will go on because some of you use this to opportunistically claim the PS5 is punching above its weight, when you know all along that a 16 GB 3070 would drastically change the result. Hence, the idea that the PS5 performs better than the "3070" chip is just disingenuous when the chip is being stalled by the developer's poor memory management decisions.

The only way to truly know if the 3070 needs 900p internal res to hit 60 fps is to reduce textures to low or use the 16 GB 4060 Ti. Even medium textures will put some strain on these cards; I had to use low textures in Ratchet and Clank to get proper GPU-bound performance out of my 3070.
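(For anyone who wants to watch the spillover themselves: the dedicated-vs-shared counters are exposed through DXGI's video memory budget query, the same numbers Task Manager shows. A minimal sketch; the ~1.5 GB threshold is just the rule of thumb from the post above, not an API constant:)

```cpp
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")
using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter1> adapter1;
    factory->EnumAdapters1(0, &adapter1); // adapter 0 = primary GPU
    ComPtr<IDXGIAdapter3> adapter;
    adapter1.As(&adapter);                // budget query needs IDXGIAdapter3

    // LOCAL = dedicated VRAM; NON_LOCAL = shared system memory over PCIe.
    DXGI_QUERY_VIDEO_MEMORY_INFO local{}, nonLocal{};
    adapter->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &local);
    adapter->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_NON_LOCAL, &nonLocal);

    const double MiB = 1024.0 * 1024.0;
    std::printf("Dedicated VRAM: %6.0f / %6.0f MiB\n",
                local.CurrentUsage / MiB, local.Budget / MiB);

    // Rule of thumb from the post above: sustained shared usage past
    // ~1.5 GB suggests the GPU is spilling resources over PCIe.
    const bool overflow = nonLocal.CurrentUsage > 1536ull * 1024 * 1024;
    std::printf("Shared memory:  %6.0f MiB%s\n", nonLocal.CurrentUsage / MiB,
                overflow ? "  <-- likely VRAM overflow" : "");
    return 0;
}
```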
 

Gaiff

SBI’s Resident Gaslighter
Memory-bound drops also manifest with max GPU load. The only way to properly identify them is to look at shared VRAM usage, which Alex refuses to do for some reason.

That's why the 3070 also performs far below the PS5 at equivalent settings/resolutions: it is offloading 4.5 GB worth of memory to regular RAM. This is something PC games should not do; they should do proper texture streaming instead. In most other cases, even breaching past 1 GB of shared memory will cause insane performance problems, and you stop getting your GPU's worth of performance at that point.

[screenshots: shared VRAM usage]



If a game uses more than 1-1.5 GB of shared memory, you're not getting the optimal GPU performance you could get. Simple as that. Alan Wake 2, Avatar, and most UE5 titles properly stream textures and never go above 1.5 GB of shared memory, and as such, in those games these GPUs perform like you would expect.

This war will go on because some of you use this to opportunistically claim the PS5 is punching above its weight, when you know all along that a 16 GB 3070 would drastically change the result. Hence, the idea that the PS5 performs better than the "3070" chip is just disingenuous when the chip is being stalled by the developer's poor memory management decisions.
I think this can be somewhat mitigated by lowering the texture streaming speed from Fastest down to Fast.
 

yamaci17

Member
I think this can be somewhat mitigated by lowering the texture streaming speed from Fastest down to Fast.
This is with that setting on "Normal" lol

I will surely do tests in Forbidden West comparing low, medium, and high textures to see whether they affect performance. I'm sure Nixxes are proud of their solution and will use it in every port of theirs. In hindsight, it is useful because you at least have a chance at higher textures as long as you have performance to spare. But of course it then becomes fuel for people to claim the PS5 performs like a 3070, so I'm conflicted about this behaviour. I'd still prefer if they just streamed textures by distance, because otherwise we have to deal with these people saying the 3070 needs 900p internal to hit 60 FPS. I'm sure without VRAM bottlenecks, the 16 GB 4060 Ti will perform much better than the 3070 in this game.

Technically, nothing stops Alex from comparing the 8 GB 4060 Ti to the 16 GB 4060 Ti and proving that VRAM is the real problem there. But I'm sure he won't do that either.
 

King Dazzar

Member
I don't know how good your eyesight is; if you can't see details, even a 4K stream and a big-ass TV won't help you see the difference.

I played Burning Shores in 40fps mode because performance mode looked bad (still much better than the launch version).
You taking the piss out of my old eyes? I bet you go up to wheelchair users and ask them to race you too! You springy-legged chicken.
[Thinking GIF]
 

Bojji

Member
Compared to what? It has some of the best performance-mode IQ of any game, DLSS aside. I only place Ragnarok ahead in the console realm.

Compared to the 40fps mode, it's too low-res and at the same time over-sharpened for my liking. That's why some loss of fluidity was worth the much better image quality.

40fps modes should be in every game. I won't play at 30fps (unless there is absolutely no choice...), but 40fps is fine, and in most games the difference in IQ is worth it.
 