
Marvel’s Spider-Man Remastered is out on Steam and Steam Deck | Digital Foundry breakdown review released

AFMichel

Neo Member
This is a great port and runs well even on my ancient PC, albeit with an RTX 3080 graphics card. The only things missing are a benchmark mode and an option to limit the game's framerate (I have had to use RTSS for that for now). There is an option to target certain framerates such as 60 fps, but this does not cap the framerate at 60 fps on my G-SYNC monitor; instead the game runs uncapped unless I use RTSS.

Also, HBAO+ is embarrassingly broken in this game despite NVIDIA having released a game-ready driver for it (v516.94), so you would think they would have noticed this during testing? Apparently not, because SSAO looks miles (Morales!) better. HBAO+ doesn't look like it is working at all on my PC: cars lack the ambient shadowing underneath that shows up when you use SSAO.
I capped it to 60 through the NVIDIA Control Panel. I have a 3080 Ti and it ran between 90 and 130 fps all maxed out.
 
So it seems disabling simultaneous multithreading (SMT) on your CPU increases performance by up to 15% on any processor with 6 cores or more. Like I said earlier, it seems silly to benchmark this game in its current state. There are obvious performance issues.
 

yamaci17

Member
So it seems disabling simultaneous multithreading (SMT) on your CPU increases performance by up to 15% on any processor with 6 cores or more. Like I said earlier, it seems silly to benchmark this game in its current state. There are obvious performance issues.
that's some funny stuff lol
needs a hotfix asap
but do you think SMT on would surpass SMT off? or would it just... match it? to me it seems like the game is specifically written for 6-7 threads since it targets the PS4/PS4 Pro. i also really wonder if this game utilizes SMT on PS5 or not. what if it also uses an 8-core/8-thread SMT-off config on PS5?
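For anyone re-running the SMT-on vs SMT-off benchmarks above, it helps to confirm the actual SMT state before testing. A minimal sketch, assuming a Linux box (the sysfs path below is Linux-only; on Windows the function just returns None):

```python
import os

def smt_active():
    """Return True/False for the global SMT state on Linux, or None if unknown."""
    try:
        # Linux kernels expose the global SMT state here ("1" = on, "0" = off)
        with open("/sys/devices/system/cpu/smt/active") as f:
            return f.read().strip() == "1"
    except OSError:
        # Non-Linux systems (e.g. Windows) don't have this sysfs file
        return None

logical_cpus = os.cpu_count()
print(f"logical CPUs: {logical_cpus}, SMT active: {smt_active()}")
```

With SMT off, the logical CPU count should equal the physical core count, which is a quick sanity check that the BIOS toggle actually took effect.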
 

that's some funny stuff lol
needs a hotfix asap
but do you think SMT on would surpass SMT off? or would it just... match it? to me it seems like the game is specifically written for 6-7 threads since it targets the PS4/PS4 Pro. i also really wonder if this game utilizes SMT on PS5 or not. what if it also uses an 8-core/8-thread SMT-off config on PS5?
I have a feeling it will just match it, but that seems to be enough honestly. 60+ all the time is more than reasonable for a game like this. That is, unless they also optimize the BVH stuff, in which case I'd imagine more gains.
 

yamaci17

Member
Interesting.


yes, i noted this before. ps5 flies way above what an rtx 2070 super is capable of. it practically performs near a 3060 ti, or just a bit below, in terms of pure ray tracing performance when you match fidelity mode settings. it definitely packs more punch than a 2070/2070 super, which is impressive

either those settings are not a proper match, or ps5 punches above its weight in this title. this is why alex avoided drawing comparisons to PS5 and only compared graphical settings. usually he says "see, with RT ps5 performs like a 2060s" or "see, it only matches a 2060!!". notice how he purposefully avoided doing that. because in this game, in fidelity mode at 4k, ps5 performs like a 3060 ti with ray tracing. this is not a joke, these are my personal findings as well.

i don't like the dude, but this game is the perfect opportunity for his PS5-promoting thing
 

ACESHIGH

Member
I don't understand why Nixxes, the so-called GOATs of PC ports, didn't play to the strengths of PC hardware when putting this version of the game together.

Forza Horizon has a shader compilation screen that you see on first boot or when updating drivers. That makes all the difference in the world, and performance is smooth across all configs.

Not sure why devs insist on compiling shaders on the fly on PC. Such an idiotic decision for the HW we currently have.

NX Gamer makes Briank75 look like an MS fanboy though.
 
yes, i noted this before. ps5 flies way above what an rtx 2070 super is capable of. it practically performs near a 3060 ti, or just a bit below, in terms of pure ray tracing performance when you match fidelity mode settings. it definitely packs more punch than a 2070/2070 super, which is impressive

either those settings are not a proper match, or ps5 punches above its weight in this title. this is why alex avoided drawing comparisons to PS5 and only compared graphical settings. usually he says "see, with RT ps5 performs like a 2060s" or "see, it only matches a 2060!!". notice how he purposefully avoided doing that. because in this game, in fidelity mode at 4k, ps5 performs like a 3060 ti with ray tracing. this is not a joke, these are my personal findings as well.

i don't like the dude, but this game is the perfect opportunity for his PS5-promoting thing
Ugh, you're really making me want to get the game to confirm.
 

rodrigolfp

Member
yes, i noted this before. ps5 flies way above what an rtx 2070 super is capable of. it practically performs near a 3060 ti, or just a bit below, in terms of pure ray tracing performance when you match fidelity mode settings. it definitely packs more punch than a 2070/2070 super, which is impressive

either those settings are not a proper match, or ps5 punches above its weight in this title. this is why alex avoided drawing comparisons to PS5 and only compared graphical settings. usually he says "see, with RT ps5 performs like a 2060s" or "see, it only matches a 2060!!". notice how he purposefully avoided doing that. because in this game, in fidelity mode at 4k, ps5 performs like a 3060 ti with ray tracing. this is not a joke, these are my personal findings as well.

i don't like the dude, but this game is the perfect opportunity for his PS5-promoting thing
...or the pc is not performing how it could/should.
I don't understand why Nixxes, the so-called GOATs of PC ports, didn't play to the strengths of PC hardware when putting this version of the game together.

Forza Horizon has a shader compilation screen that you see on first boot or when updating drivers. That makes all the difference in the world, and performance is smooth across all configs.

Not sure why devs insist on compiling shaders on the fly on PC. Such an idiotic decision for the HW we currently have.

NX Gamer makes Briank75 look like an MS fanboy though.
Spider-Man doesn't have stuttering from shader compilation. At least not that I noticed.
 

yamaci17

Member
Ugh, you're really making me want to get the game to confirm.
we need a person with a 3060 ti/3070 and a 5600x/11600k to make sure the CPU is not any kind of bottleneck. his tests will be met with question marks, since he will most likely use his old zen just like me

in my case, i can clearly see the GPU is at 99% usage. I still wonder, however, if the pipeline can react differently with a better CPU. i will try the same native 4k/fidelity/matched thing with smt off, since I confirmed I too get a CPU perf. increase. I will test the same place and see if I still get a similar GPU-bound performance profile

again though, doesn't look too hot. at 99% gpu usage, native 4k with fidelity-matched settings, the 3070 is clearly choking at around 50-60 FPS. considering ps5 is able to get 45-50 fps or so there, I'd expect my 3070 to reliably push above 70, considering what I've seen from other ray tracing titles. it's really an alarming difference.
 
we need a person with a 3060 ti/3070 and a 5600x/11600k to make sure the CPU is not any kind of bottleneck. his tests will be met with question marks, since he will most likely use his old zen just like me

in my case, i can clearly see the GPU is at 99% usage. I still wonder, however, if the pipeline can react differently with a better CPU. i will try the same native 4k/fidelity/matched thing with smt off, since I confirmed I too get a CPU perf. increase. I will test the same place and see if I still get a similar GPU-bound performance profile

again though, doesn't look too hot. at 99% gpu usage, native 4k with fidelity-matched settings, the 3070 is clearly choking at around 50-60 FPS. considering ps5 is able to get 45-50 fps or so there, I'd expect my 3070 to reliably push above 70, considering what I've seen from other ray tracing titles. it's really an alarming difference.
I broke down. Got a good deal on a key.
just started the game, but these are from the scenes NXG put up. PS5 fidelity settings.
 

Fafalada

Fafracer forever
If Spiderman Remastered, which streams anywhere from 25MB/s-200MB/s from storage, is heavy on PC CPUs, what would CPU performance for ~2GB/s+ streaming requirements look like?
If you know how much CPU time 200MB/s takes, 2GB/s would be linear scaling of that. From what I recall, decoding LZ derivatives would basically eat up 12-16 cores to keep up with 2GB/s of input.
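That scaling argument is easy to sanity-check with back-of-the-envelope math. The per-core throughput figures below are assumptions, chosen only to reproduce the "12-16 cores for 2 GB/s" recollection above:

```python
import math

def cores_needed(target_mb_per_s, per_core_mb_per_s):
    """Cores required to sustain a decompression rate, assuming linear scaling."""
    return math.ceil(target_mb_per_s / per_core_mb_per_s)

# 2 GB/s of compressed input at an assumed 128-175 MB/s decompressed per core
# lands in the quoted 12-16 core range
print(cores_needed(2048, 175))  # optimistic per-core throughput -> 12 cores
print(cores_needed(2048, 128))  # pessimistic per-core throughput -> 16 cores
```

The takeaway matches the post: at ~200 MB/s a couple of cores suffice, but at 2 GB/s software LZ decompression alone would swallow most of a high-end desktop CPU.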
 

yamaci17

Member
I broke down. Got a good deal on a key.
just started the game, but these are from the scenes NXG put up. PS5 fidelity settings.
damn, so I got it right, I guess? ps5 really is near a 3060 ti in this game... jeez. horrific stuff. let's hope it's not a continuing thing. i wonder if it's planned obsolescence by nvidia, somehow
 

BeardGawd

Gold Member
damn, so I got it right, I guess? ps5 really is near a 3060 ti in this game... jeez. horrific stuff. let's hope it's not a continuing thing. i wonder if it's planned obsolescence by nvidia, somehow
Why would Sony spend the time to maximize performance on someone else's platform?

MS on the other hand uses the same API for both Xbox and PC, so it's easier.
 

Zathalus

Member
damn, so I got it right, I guess? ps5 really is near a 3060 ti in this game... jeez. horrific stuff. let's hope it's not a continuing thing. i wonder if it's planned obsolescence by nvidia, somehow
I think it's just because the game heavily favors AMD with the RT implementation. The 6700 XT performs the same as the 3060 Ti, which is not usually the case with RT.
 

yamaci17

Member
Why would Sony spend the time to maximize performance on someone else's platform?

MS on the other hand uses the same API for both Xbox and PC, so it's easier.
well, all I know is that it will be a heavy slap to the people who kept saying PS5 can only match a 2060/2060s when it comes to ray tracing
this is huge. it is really funny, and actually miserable, that "Alex" failed to mention this in his "detailed" RT review where he compared PS5 to PC hardware.
 
damn, so I got it right, I guess? ps5 really is near a 3060 ti in this game... jeez. horrific stuff. let's hope it's not a continuing thing. i wonder if it's planned obsolescence by nvidia, somehow
Yeah, it seems like a fine port, but there are still things that seem off. With decent RT the 30 series should beat the 6000 series quite handily, but it doesn't here. I'm betting it's more a case of "good enough" and nothing sinister.
 
well, all I know is that it will be a heavy slap to the people who kept saying PS5 can only match a 2060/2060s when it comes to ray tracing
this is huge. it is really funny, and actually miserable, that "Alex" failed to mention this in his "detailed" RT review where he compared PS5 to PC hardware.
Think it's more that the engine is well optimized for ps5 and the port is just powering through instead of getting its own optimization. I mean, it's only using 7gb VRAM and 7gb RAM. Doesn't seem to go much above that, when it could be very useful to use more if you have it. I got 20gb of RAM just sitting there.
 

yamaci17

Member
Think it's more that the engine is well optimized for ps5 and the port is just powering through instead of getting its own optimization. I mean, it's only using 7gb VRAM and 7gb RAM. Doesn't seem to go much above that, when it could be very useful to use more if you have it. I got 20gb of RAM just sitting there.
nope, this game only utilizes 6.2-6.3 gb of an 8 gb buffer. you have 700-800 mb of background stuff. check per-game dedicated VRAM usage; you will see it never goes above 6.2 gb. not because it does not need more, it simply caps out there.

i think i cracked the code: if nx gamer did his test with very high textures, our 8 GB GPUs are heavily bottlenecked by VRAM at 4k with very high textures. that would explain the enormous performance drop in some scenes for him

by any chance, what texture setting have you used? alex states ps5 uses very high textures in his video, but put high textures in his eurogamer article

i have no idea why the game behaves this way. normally, a game decides to use normal memory for GPU operations when you're out of VRAM.

this game however starts dumping into normal memory once GPU memory utilization reaches 6-6.2 GB. naturally, the GPU has to wait for data to be sent back and forth between RAM and VRAM. the more data it transfers over PCI-e, the higher the penalty.

very high textures - 44 fps
gpu memory used = 5.36 GB
shared memory usage = 3.34 GB
so the game uses 3.34 gb of CPU memory for GPU operations, which causes a huge performance slowdown. (probably why nx gamer's 2070 tanked if he used very high textures)
notice that PCIe data transfer is at 8 GB/s

high textures - 51 fps
gpu memory used = 5.37 GB
shared memory usage = 2.02 GB
see the pattern? the less data goes into normal memory, the higher the GPU framerate. BUT: total system VRAM usage is at 6.8 GB. so there's clearly a whopping total of nearly 1.2-1.3 GB still available for the game to use. instead, it decides to dump into normal memory.
notice that PCIe data transfer is at 1.4 GB/s

very low textures - 56 fps (28% more than very high's 44 fps)
gpu memory used = 5.37 GB
shared memory usage = 0.73 GB
the game still decides to use normal memory somehow, despite there being a whopping 2 GB of free GPU memory
PCIe data transfer is at 1 GB/s

so there's something awry going on. i think 8 GB GPUs are seriously hamstrung by the game's logic in allocating memory. it aggressively starts using system memory once it hits 5.3-6 GB of GPU memory, which is behaviour I've not seen in any other AAA game yet
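The three measurements above line up cleanly: the more data spills into shared (system) memory, the lower the framerate. A quick check using only the numbers from the post:

```python
# (texture setting, fps, shared system memory used by the GPU in GB)
# numbers taken directly from the measurements above
runs = [
    ("very high", 44, 3.34),
    ("high",      51, 2.02),
    ("very low",  56, 0.73),
]

# sort by shared-memory spill; if the theory holds, fps falls as spill grows
by_spill = sorted(runs, key=lambda r: r[2])
fps_in_spill_order = [fps for _, fps, _ in by_spill]
assert fps_in_spill_order == sorted(fps_in_spill_order, reverse=True)
print("pattern holds: more shared-memory spill -> lower fps")
```

Three data points are not proof of causation, of course, but the monotonic relationship is consistent with the PCIe-transfer explanation given above.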
 
nope, this game only utilizes 6.2-6.3 gb of an 8 gb buffer. you have 700-800 mb of background stuff. check per-game dedicated VRAM usage; you will see it never goes above 6.2 gb. not because it does not need more, it simply caps out there.

i think i cracked the code: if nx gamer did his test with very high textures, our 8 GB GPUs are heavily bottlenecked by VRAM at 4k with very high textures. that would explain the enormous performance drop in some scenes for him

by any chance, what texture setting have you used? alex states ps5 uses very high textures in his video, but put high textures in his eurogamer article
I used the settings posted in the OP for fidelity. Going to actually do my playthrough at 1440p with the highest settings I can keep above 60 (a few dips are fine)
 

yamaci17

Member
I used the settings posted in the OP for fidelity. Going to actually do my playthrough at 1440p with the highest settings I can keep above 60 (a few dips are fine)
yes, now replay that scene with very high textures. If I'm correct, you too will have a huge performance drop. in his video, he shows very high textures for all PS5 modes. i think the eurogamer chart is wrong in that aspect.



I'm now fairly sure that NX Gamer used very high textures, referencing this bit in the video. As you can see, very high textures tank my performance to 44 FPS from 51 FPS. sure as hell, it will tank performance heavily for a 2070s too.

by the way, ps5 clearly uses very high textures. actually, ps4 uses high textures; it's impossible that ps4 uses medium textures. i set the game to medium, and in the opening cutscene almost nothing is readable. i opened the base ps4 intro again, and everything is readable and finely textured. base ps4 matches high textures / ps5 matches very high textures

i guess at 1440p you and me can get away with very high textures. but 4k seems like a big nope.
 
yes, now replay that scene with very high textures. If I'm correct, you too will have a huge performance drop. in his video, he shows very high textures for all PS5 modes. i think the eurogamer chart is wrong in that aspect.



I'm now fairly sure that NX Gamer used very high textures, referencing this bit in the video. As you can see, very high textures tank my performance to 44 FPS from 51 FPS. sure as hell, it will tank performance heavily for a 2070s too.

by the way, ps5 clearly uses very high textures. actually, ps4 uses high textures; it's impossible that ps4 uses medium textures. i set the game to medium, and in the opening cutscene almost nothing is readable. i opened the base ps4 intro again, and everything is readable and finely textured. base ps4 matches high textures / ps5 matches very high textures

i guess at 1440p you and me can get away with very high textures. but 4k seems like a big nope.
At 1440p and higher settings with very high textures, the game runs great outside of CPU-limited parts. Don't feel like disabling SMT though, so I'll hope for a patch.
 
I think it's just because the game heavily favors AMD with the RT implementation. The 6700 XT performs the same as the 3060 Ti, which is not usually the case with RT.
Probably because for once the game's RT was optimized for AMD's implementation. It would also mean the PS5 version (with RT) performs the same as a 6700 XT.
 

yamaci17

Member
At 1440p and higher settings with very high textures, the game runs great outside of CPU-limited parts. Don't feel like disabling SMT though, so I'll hope for a patch.
yeah, but the vram issue has to be fixed. they're most likely "thinking" ahead of us by giving us vram "headroom" so background apps can function

but guess what, I don't have any. and the game still refuses to use anything over 6.2 GB. there's no logic in the game deciding to offload 4.5 GB worth of memory into normal RAM and cause slowdowns when the GPU has 1.9 GB of memory free





it just isn't fun when you're limited to a 6.2 GB VRAM pool when you have an 8 GB card. let your engine dynamically handle its loads depending on the free memory available.

Forza Horizon 5 does a similar thing where, if your VRAM usage pushes past 6.4 GB, you start to get rainbow-colored textures, so you have to drop back to high textures (not maxed out). I don't know, it doesn't make sense to me. I've clearly seen upwards of 7.5 GB VRAM usage in Cyberpunk 2077 and Halo Infinite, and they ran perfectly fine with maximum textures.

If this mentality of "we will only use 80% of your available VRAM and you will like it" continues, I will have to sell this one and get myself a 16 GB GPU or something.

Now it makes sense too: even 3080/3080 Ti users are having VRAM-related frame drops after an hour or so of playtime. The 80% rule makes it so that if you have a 10 gig card, the stupid game will only use 8 GB of it, and if you have a 12 GB card, it will only use 9.6 GB of it.

I'm purely talking about the VRAM the game uses, not the background stuff. As you can see, it can easily be reduced by the user. Just subtract 500-800 mb for every system; most systems will easily have 500-800 mb of bloat, especially if you leave hardware acceleration on in applications (which I do not).

however, the game does not even respect whether you have background load or not. it just assumes everyone will by default have 20% of their VRAM filled. as you can see, however, my idle system barely uses anything over 200 mb, which does not even amount to 5% of my total budget.

the only reason I banked on 8 GB surviving a couple more years was to have minimal bloat in the background and have free VRAM resources most people wouldn't have. Yet now I'm being limited by assumptions made by stupid developers. Rightfully so, I'm very angry.

It's unbelievable that not a single reviewer picked this up.
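If the ~80% cap theory above is right, the usable pools work out exactly as described. The 0.8 factor is the post's guess from observed behavior, not a confirmed engine constant:

```python
ASSUMED_CAP = 0.8  # assumed fraction of VRAM the game will touch, per the post

for total_gb in (8, 10, 12):
    usable = total_gb * ASSUMED_CAP
    print(f"{total_gb} GB card -> game uses at most ~{usable:.1f} GB")
# 8 GB  -> 6.4 GB (close to the observed ~6.2-6.3 GB ceiling)
# 10 GB -> 8.0 GB, 12 GB -> 9.6 GB (matching the 3080/3080 Ti reports)
```

The fact that one fraction reproduces the reported ceilings on 8, 10, and 12 GB cards is what makes the fixed-percentage theory plausible, even though the real heuristic inside the engine is unknown.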
 
yeah, but the vram issue has to be fixed. they're most likely "thinking" ahead of us by giving us vram "headroom" so background apps can function

but guess what, I don't have any. and the game still refuses to use anything over 6.2 GB. there's no logic in the game deciding to offload 4.5 GB worth of memory into normal RAM and cause slowdowns when the GPU has 1.9 GB of memory free





it just isn't fun when you're limited to a 6.2 GB VRAM pool when you have an 8 GB card. let your engine dynamically handle its loads depending on the free memory available.

Forza Horizon 5 does a similar thing where, if your VRAM usage pushes past 6.4 GB, you start to get rainbow-colored textures, so you have to drop back to high textures (not maxed out). I don't know, it doesn't make sense to me. I've clearly seen upwards of 7.5 GB VRAM usage in Cyberpunk 2077 and Halo Infinite, and they ran perfectly fine with maximum textures.

If this mentality of "we will only use 80% of your available VRAM and you will like it" continues, I will have to sell this one and get myself a 16 GB GPU or something.

Now it makes sense too: even 3080/3080 Ti users are having VRAM-related frame drops after an hour or so of playtime. The 80% rule makes it so that if you have a 10 gig card, the stupid game will only use 8 GB of it, and if you have a 12 GB card, it will only use 9.6 GB of it.

I'm purely talking about the VRAM the game uses, not the background stuff. As you can see, it can easily be reduced by the user. Just subtract 500-800 mb for every system; most systems will easily have 500-800 mb of bloat, especially if you leave hardware acceleration on in applications (which I do not).

however, the game does not even respect whether you have background load or not. it just assumes everyone will by default have 20% of their VRAM filled. as you can see, however, my idle system barely uses anything over 200 mb, which does not even amount to 5% of my total budget.

the only reason I banked on 8 GB surviving a couple more years was to have minimal bloat in the background and have free VRAM resources most people wouldn't have. Yet now I'm being limited by assumptions made by stupid developers. Rightfully so, I'm very angry.

It's unbelievable that not a single reviewer picked this up.
I feel like the SMT thing, plus this, plus RTX cards "underperforming" really puts a blemish on this port. Was hoping for better from Nixxes.
 

yamaci17

Member
I feel like the SMT thing, plus this, plus RTX cards "underperforming" really puts a blemish on this port. Was hoping for better from Nixxes.
i don't even think most cards are underperforming now; most likely most of the GPUs out there are under heavy VRAM pressure, 3080 included


check this ray tracing benchmark.

at 1440p, the rtx 3080 outperforms the 2080 ti by 20 percent (it should be 30-35% under normal conditions).

but looking at 4k benchmarks, the 3080 only and ONLY outperforms the 2080 ti by a puny 7% margin.

and look at that, the 2080 ti outperforms the 3070 by 25% too. guess what: VRAM

and suddenly, the 3070 is near 6700 xt levels of performance at 4K.

I'm sure we would run into a similar performance penalty at 1440p too, after some playtime. I just opened the game at 1440p, and i still get the 6.2 GB VRAM thing, and the game casually offloads 2.3 GB worth of GPU data into normal RAM.

the fact that 4k+dlss pushes the 3080 down close to the 2080 ti is just bad. the game simply refuses to use all available VRAM.
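The 1440p-vs-4K gap above can be framed as simple relative deltas. The percentages are the ones quoted in the post; absolute fps scores are not given, so the 2080 Ti is just normalized to 100 here:

```python
def pct_faster(a, b):
    """How much faster score a is than score b, in percent."""
    return (a / b - 1) * 100

baseline_2080ti = 100.0   # 2080 Ti normalized to 100
r3080_1440p = 120.0       # +20% at 1440p, per the benchmark cited above
r3080_4k = 107.0          # +7% at 4K, per the same benchmark
expected_lead = 30        # low end of the usual 30-35% 3080-vs-2080 Ti RT gap

shortfall = expected_lead - pct_faster(r3080_4k, baseline_2080ti)
print(f"at 4K the 3080's lead is ~{shortfall:.0f} points below its usual gap")
```

That 4K shortfall growing while the 1440p gap stays closer to normal is what points at a resolution-dependent (i.e. VRAM) limit rather than a flat driver or engine penalty.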
 

ACESHIGH

Member
Yeah, the port has some work ahead. Not as perfect as DF mentioned: CPU hog, poor VRAM allocation, sharpened like crazy, texture flickering.

I hate how DF seems to be in bed with these Sony devs. They are too quick to claim that these ports are great because they run decently on their overpowered hardware. To be a great port the game has to play to PC's strengths, and this game does not. Same as God of War, which had like 10 patches and still isn't fixed.

Days Gone was the best port. HZD is great after fixes, but only because there was a lot of backlash. I recommend posting these findings on the Steam forum; DF does not give a crap. Those guys settle for 1440p on a 3080 and claim it's a great port, FFS.
 

yamaci17

Member
Yeah, the port has some work ahead. Not as perfect as DF mentioned: CPU hog, poor VRAM allocation, sharpened like crazy, texture flickering.

I hate how DF seems to be in bed with these Sony devs. They are too quick to claim that these ports are great because they run decently on their overpowered hardware. To be a great port the game has to play to PC's strengths, and this game does not. Same as God of War, which had like 10 patches and still isn't fixed.

Days Gone was the best port. HZD is great after fixes, but only because there was a lot of backlash. I recommend posting these findings on the Steam forum; DF does not give a crap. Those guys settle for 1440p on a 3080 and claim it's a great port, FFS.
haven't even mentioned the LOD bug yet. the game uses degraded LODs at all resolutions

LODs are so badly tuned that at 1080p, some textures refuse to load:


look at this sweater... no texture




this is not even a DLSS/FSR issue. once your render resolution drops below exactly 1250p, the game engine will stop loading proper textures for lots of things (not everything). this is why 1440p+dlss quality gives degraded-texture gameplay. you have to stay at native 1440p. native 1080p is also hugely affected by the issue. yet the base ps4 runs at native 1080p and the sweater loads perfectly fine

how convenient that DF failed to see these kinds of quirks too
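The ~1250-line threshold theory is easy to check against DLSS internal resolutions. The per-axis scale factors below are the commonly documented DLSS mode ratios; the 1250p threshold itself is the post's own estimate, not an official figure:

```python
# commonly documented DLSS per-axis scale factors (assumed here)
DLSS_SCALE = {"quality": 2 / 3, "balanced": 0.58, "performance": 0.5}
LOD_THRESHOLD = 1250  # render height below which textures reportedly degrade

for output_h in (1080, 1440, 2160):
    for mode, scale in DLSS_SCALE.items():
        render_h = round(output_h * scale)
        status = "degraded" if render_h < LOD_THRESHOLD else "ok"
        print(f"{output_h}p {mode}: renders at {render_h}p -> {status}")
# 1440p quality renders internally at 960p (< 1250) -> degraded, matching the report
# 2160p quality renders internally at 1440p -> ok
```

This lines up with the observation above: 1440p + DLSS Quality lands at 960p internally, well under the reported threshold, while 4K + DLSS Quality stays at 1440p and keeps full textures.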
 

bbr1979

Banned
The silicon for the I/O block was designed to decompress 5.5 GB/s worth of data. That amount of bandwidth was designed for instant load times, and to eliminate data streaming bottlenecks in game designs. Will games actually saturate that amount of bandwidth? I don't see that happening but I'm happy to be proven wrong.

Mark Cerny even mentioned that developers only asked for a minimum of 1 GB/s of read and write speed from disk. I think that number is what most "next-gen" games including third parties will be designed around and DirectStorage paired with an SSD with at least 1 GB/s will be enough to handle next-gen loading and streaming on PC, although I'm sure many GPU's will comfortably handle much higher speeds if necessary.
For now the ps5 has better streaming quality than any pc config. Also, DirectStorage currently supports only software mode. Also, the developers of Avatar said they plan to stream up to 2 gb per second of textures alone on the new consoles.
 

Stuart360

Member
we need a person with a 3060 ti/3070 and a 5600x/11600k to make sure the CPU is not any kind of bottleneck. his tests will be met with question marks, since he will most likely use his old zen just like me

in my case, i can clearly see the GPU is at 99% usage. I still wonder, however, if the pipeline can react differently with a better CPU. i will try the same native 4k/fidelity/matched thing with smt off, since I confirmed I too get a CPU perf. increase. I will test the same place and see if I still get a similar GPU-bound performance profile

again though, doesn't look too hot. at 99% gpu usage, native 4k with fidelity-matched settings, the 3070 is clearly choking at around 50-60 FPS. considering ps5 is able to get 45-50 fps or so there, I'd expect my 3070 to reliably push above 70, considering what I've seen from other ray tracing titles. it's really an alarming difference.
A benchmark guy i watch has done a lot of Spider-Man vids with various gpus, all using a 5900x cpu. He checks different resolutions, settings, and RT on and off. They may be helpful for you -

 

yamaci17

Member
A benchmark guy i watch has done a lot of Spider-Man vids with various gpus, all using a 5900x cpu. He checks different resolutions, settings, and RT on and off. They may be helpful for you -

the problem is not a CPU bottleneck:
8 gb gpus run into all sorts of vram/ram limitations at 1440p and beyond with ray tracing enabled and very high textures

and high textures, at times, look worse than ps4, so they're not an option either


for now i settled on maxed out ray tracing with very high textures at native 1080p. playing at 1440p and above causes huge VRAM-bound bottlenecks. setting the textures to High is a fix, but you get worse-than-PS4 textures most of the time. since dlss/fsr still uses 1440p/4k assets, they still incur heavy memory consumption


also, high textures + high reflections will give you weird reflections...








using "very high geometry" EVEN with low textures fixes the problem;



so, takeaways;

1) very high geometry has better reflections than ps5 regardless of texture setting, whether you have textures at low or very high

2) high textures+high geometry has worse reflections than ps5 (by a huge margin). high geometry practically needs very high textures to properly create reflections

3) high geometry only produces ps5 equivalent reflections when paired with very high textures

and at 1440p and beyond, it is impossible (with how the game manages memory allocation) to fit high geometry+very high textures. you can fit very high geometry+high textures at 1440p, but then you will have to live with these textures:

ps4 textures



very high textures




"so called" high textures which Alex suggests for " 6 GB GPUs". You also need to use "high textures" at 1440p and above with 8 GB cards to enable ray tracing without having VRAM bound performance drops






TL;DR
- The game only uses a maximum of 80% of VRAM (aside from background apps)
- You need very high textures to ensure you get the high quality textures intended for both PS4 and PS5. High textures are not a MATCH for PS4 textures. They bundled the improved PS5 textures and the old-gen PS4 textures into the Very High texture setting. If you use High textures, you're getting even worse textures than PS4. be careful
- 8 GB VRAM GPUs, with how the game handles memory allocation, are not enough at 1440p and above to run both ray tracing and Very High textures. Only at 1080p did I manage to run ray tracing with Very High textures without enormous slowdowns.
- Slowdowns are caused by excess memory spilling into normal RAM, which causes constant data transfer over PCI-e. This does not happen when you have enough VRAM or play at lower resolutions.

In short, it is impossible to match PS5-equivalent performance due to the VRAM bottlenecks 8 GB cards experience at 1440p and above, since the PS5 constantly runs at around 1440p in its 60 FPS modes.

The VRAM bottleneck is real: an RTX 3060 can almost match PS5 ray tracing performance at native 4K. A 3070/3070 Ti cannot. At native 4K, the VRAM situation becomes so drastic that the 3070/3070 Ti drops frames below the 30s with very high textures. The only way to play at native 4k with a 3070 like the PS5 does is to use high textures, which, as shown above, degrades overall texture quality quite heavily.


Here at native 4k with nearly ps5-equivalent RT settings, the 3060 is able to get 40+ FPS (until it too is affected by the VRAM bottleneck, and yes, it is too)

and here the 3070 with similar RT settings;

drops below 40 and underperforms, even compared to the 3060

I know it has been a long post, but due to how the game handles memory allocation, and due to how bad high textures look, for now it is impossible to enjoy this game at maximum fidelity at 1440p with an 8 GB card. All such cards severely underperform, causing performance slowdowns and excess PCIe transfers.
 

ChiefDada

Member
For now the ps5 has better streaming quality than any pc config. Also, DirectStorage currently supports only software mode. Also, the developers of Avatar said they plan to stream up to 2 gb per second of textures alone on the new consoles.

It's surprising to me that many people didn't expect PC to have memory-related performance issues with next-gen console ports. That said, I am a bit surprised that Spiderman Remastered is having such a significant impact; current-gen games will only become larger, with more demanding and complex data management requirements. More system memory isn't the great solution PC enthusiasts thought it would be.
 

Stuart360

Member
It's surprising to me that many people didn't expect PC to have memory-related performance issues for next gen console ports. Although I am a bit surprised that Spiderman Remastered is having such a significant impact; current gen games will only become larger with more demanding and complex data management requirements. More system memory isn't the great solution PC enthusiast thought it would be.
Not really, check the benchmark vids I posted above, way higher fps, resolution, and settings than PS5 if you have the hardware. Hell, I can max out the game at 4K/60 with my 3700X and 1080 Ti, no RT of course.
And the gap will only widen from here on in.
 

Stuart360

Member
the problem is not a raw performance bottleneck: 8 GB GPUs run into all sorts of VRAM/RAM limitations at 1440p and beyond with ray tracing enabled and Very High textures

and High textures, at times, look worse than PS4's, so they're not an option either


for now I settled on maxed-out ray tracing with Very High textures at native 1080p. Playing at 1440p and above causes huge VRAM-bound bottlenecks. Setting textures to High is a fix, but you get worse-than-PS4 textures most of the time. And since DLSS/FSR still use 1440p/4K assets, they still incur heavy memory consumption


also, high textures + high geometry will give you weird reflections...








using "very high geometry" EVEN with low textures fixes the problem;



so, takeaways;

1) Very high geometry has better reflections than PS5 regardless of texture setting, whether you have textures at Low or Very High

2) High textures + high geometry has worse reflections than PS5 (by a huge margin). High geometry practically needs Very High textures to build its reflections properly

3) High geometry only produces PS5-equivalent reflections when paired with Very High textures

And at 1440p and beyond, it is impossible (with how the game manages memory allocation) to fit high geometry + very high textures. You can fit very high geometry + high textures at 1440p, but then you will have to live with these textures:

ps4 textures



very high textures




"so called" high textures which Alex suggests for " 6 GB GPUs". You also need to use "high textures" at 1440p and above with 8 GB cards to enable ray tracing without having VRAM bound performance drops






TL;DR
- The game only uses a maximum of 80% of available VRAM (background apps aside)
- You need Very High textures to get the high-quality textures intended for both PS4 and PS5. High textures are NOT a match for PS4 textures: the Very High setting bundles the improved PS5 textures together with the old-gen PS4 textures, while High often gives you textures even worse than PS4's. Be careful
- 8 GB GPUs, with how the game handles memory allocation, are not enough at 1440p and above to run both ray tracing and Very High textures. Only at 1080p did I manage to run ray tracing with Very High textures without enormous slowdowns
- The slowdowns are caused by excess memory spilling into system RAM, which causes constant data transfers over PCIe. This does not happen when you have enough VRAM or play at lower resolutions

In short, it is impossible to match PS5-equivalent performance due to the VRAM bottlenecks 8 GB cards experience at 1440p and above, since the PS5's 60 fps modes constantly run at around 1440p.

The VRAM bottleneck is real: the RTX 3060 can almost match PS5 ray tracing performance at native 4K; the 3070/3070 Ti cannot. At native 4K, the VRAM situation becomes so drastic that the 3070/3070 Ti drops below 30 fps with Very High textures. The only way to play at native 4K on a 3070 the way the PS5 does is to use High textures, which, as shown above, degrades overall texture quality quite heavily.


Here, at native 4K with nearly PS5-equivalent RT settings, the 3060 is able to get 40+ fps (until it too is hit by the VRAM bottleneck, and yes, it eventually is)

and here the 3070 with similar RT settings;

it drops below 40 fps and underperforms, even compared to the 3060

I know it has been a long post, but due to how the game handles memory allocation, and due to how bad High textures look, for now it is impossible to enjoy this game at maximum fidelity at 1440p with an 8 GB card. All 8 GB cards severely underperform, with slowdowns caused by excess PCIe transfers.
I think the VRAM/RAM problems may be solely RT related. I say that because on my 1080 Ti (which doesn't have RT, of course), VRAM hangs around 6.7 GB at 1440p max settings, 7-8 GB at 4K max settings, and system RAM is always in the 8-10 GB range at both resolutions.
 

yamaci17

Member
I think the VRAM/RAM problems may be solely RT related. I say that because on my 1080 Ti (which doesn't have RT, of course), VRAM hangs around 6.7 GB at 1440p max settings, 7-8 GB at 4K max settings, and system RAM is always in the 8-10 GB range at both resolutions.
It is, but that doesn't change the fact that the 3070 can't stay on par with the PS5 when RT is enabled at native 4K, or at 4K with upsampling (DLSS, FSR, or IGTI).

There, you said it: the game uses 6.7 GB without RT on your end. The game is unable to go past 6.4 GB on 8 GB GPUs regardless of whether RT is on or not. It artificially caps all GPUs at 80% VRAM utilization (SPECIFICALLY the game's own VRAM usage, not TOTAL usage).

Enable per-app VRAM allocation in the Afterburner overlay and push the game to its maximum (6K, maybe). You will see that it never breaches past 8.8 GB. Once it gets near 8.8 GB, it retracts back to somewhat close to 8 GB and starts using normal RAM instead.

Once again, I'm not talking about total VRAM usage. Make sure to close all applications that may use VRAM (Discord, browser, etc.).
 

ChiefDada

Member
Not really, check the benchmark vids I posted above, way higher fps, resolution, and settings than PS5 if you have the hardware. Hell, I can max out the game at 4K/60 with my 3700X and 1080 Ti, no RT of course.
And the gap will only widen from here on in.

Right, no RT; are we aiming for apples to apples or not? Also the 3070, you know, the one with the VRAM limitation, is curiously missing from the analysis.

And again, this is Spider-Man Remastered, a last-gen game at its core with a max ~200 MB/s streaming requirement. You're right, the gap will widen and PC VRAM requirements will keep increasing; even then, something like 16 GB still won't be enough for the power differential to translate into a matching performance delta once current-gen console streaming is moving large amounts of data moment to moment.
 

yamaci17

Member
Right, no RT; are we aiming for apples to apples or not? Also the 3070, you know, the one with the VRAM limitation, is curiously missing from the analysis.

And again, this is Spider-Man Remastered, a last-gen game at its core with a max ~200 MB/s streaming requirement. You're right, the gap will widen and PC VRAM requirements will keep increasing; even then, something like 16 GB still won't be enough for the power differential to translate into a matching performance delta once current-gen console streaming is moving large amounts of data moment to moment.
No, you miss the point: having a PS5-equivalent free budget (around 10 GB) is enough to run at PS5-equivalent RT settings. The 3060 has this (12 GB × 0.80 = 9.6 GB) and easily handles RT even at native 4K.

The problem is created by Nixxes artificially locking all GPUs to a maximum of 80% of their VRAM. Even the 3080 is performing near a 2080 Ti at 4K due to being artificially hamstrung to only an 8 GB budget. I've reported this to them 5 times and tweeted at Alex, but no one seems to care or notice.

Having a 7.5-8 GB budget at 1440p would be enough to use ray tracing and maximum textures easily; being limited to 6.4 GB severely limits the capabilities of all 8 GB RTX GPUs. This issue is created by Nixxes themselves.

Spider-Man Remastered on PS5 is doing nothing fancy with the SSD. The dream you people have, that SSDs will constantly dump huge amounts of data in and out of memory, is just that, a dream, and it has nothing to do with this situation. An SSD is not and never will be a substitute for VRAM. A GPU with 10-12 GB of VRAM should last the entire gen matching PS5 settings.

I can run Cyberpunk 2077 at 4K, DLSS Balanced, with RT GI, reflections and shadows, maximum textures, VRAM utilization around 7.7 GB, and fine performance in a gorgeous-looking RT-GI'd game. In this game I can't even enable RT at 1440p, let alone 4K, without running into VRAM bottlenecks. You have to understand this: that extra 1.5-2 GB of budget has greatly helped 8 GB GPUs over the past 2 years.
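If the 80% cap theory is right, the per-card budgets work out like this. A quick sketch; the `usable_vram` helper and the ~10 GB PS5 figure are my own assumptions from this thread, not anything confirmed by Nixxes:

```python
# Quick sketch of the claimed 80% per-app VRAM cap (an assumption from this
# thread, not confirmed by Nixxes) vs. the ~10 GB a PS5 game can use.
def usable_vram(total_gb: float, cap: float = 0.80) -> float:
    """VRAM the game will actually touch if it caps itself at `cap`."""
    return total_gb * cap

PS5_GAME_BUDGET_GB = 10.0  # rough figure used throughout this thread

cards = {
    "RTX 3060 12GB": 12.0,
    "RTX 3070 8GB": 8.0,
    "RTX 3080 10GB": 10.0,
}
for name, total in cards.items():
    budget = usable_vram(total)
    print(f"{name}: capped budget {budget:.1f} GB (vs PS5's ~{PS5_GAME_BUDGET_GB:.0f} GB)")
```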
 
Last edited:

yamaci17

Member
Well the PS5 version runs at 1440p, doesn't it? And at around 40-50 fps.

The PS5 can hold a stable 80+ fps at 1440p


At native, no-bullshit 4K, the PS5 can do 45+ frames most of the time with VRR enabled and ray tracing on. No dynamic resolution involved.

I've literally linked you the 3060 doing native 4K plus PS5-equivalent ray tracing at playable framerates



See how the 3060 gets nearly PS5-level performance (this is perfectly normal: the PS5 has better raster performance than the 3060, the 3060 has better RT performance than the PS5, so they mostly perform about the same)

What I'm saying is that the 3070/3070 Ti is unable to achieve what the 3060 can here. Even at 1440p, you eventually run into the exact same VRAM bottleneck after 5-15 minutes of playtime.
 
Last edited:

Stuart360

Member
See how the 3060 gets nearly PS5-level performance (this is perfectly normal: the PS5 has better raster performance than the 3060, the 3060 has better RT performance than the PS5, so they mostly perform about the same)

What I'm saying is that the 3070/3070 Ti is unable to achieve what the 3060 can here. Even at 1440p, you eventually run into the exact same VRAM bottleneck after 5-15 minutes of playtime.
Right, I think I'm getting you. You're saying the RAM problems are basically holding back the 3070, and you should be getting way better results than the 3060 and PS5 but you aren't?

Well hopefully these problems get patched. All the problems with HZD and GOW got patched in the end.
 

ChiefDada

Member
No, you miss the point: having a PS5-equivalent free budget (around 10 GB) is enough to run at PS5-equivalent RT settings. The 3060 has this (12 GB × 0.80 = 9.6 GB) and easily handles RT even at native 4K.

Spider-Man Remastered on PS5 is doing nothing fancy with the SSD.

And you missed MY point, even though you mentioned it with the bolded part. The combination of high textures and RT is pushing VRAM requirements.

Spider-Man Remastered is doing nothing fancy with streaming at ~200 MB/s. The CPU asset-decompression load is minimal in the context of current gen, yet it is causing performance to not be in step with the hardware differential. To my knowledge, this is the first PS5-only game with RT to be ported to PC, so it makes sense this is the first time we are seeing such a disparity. Miles Morales is up next, and if I'm not mistaken the textures and/or geometry are slightly higher in that game.

The dream you people have, that SSDs will constantly dump huge amounts of data in and out of memory, is just that, a dream, and it has nothing to do with this situation.

Nope, Ratchet and Clank is doing this right now. And I will bookmark this conversation and come back when it releases for PC to prove that a similar performance/hardware disparity will exist.
 

Stuart360

Member
And you missed MY point, even though you mentioned it with the bolded part. The combination of high textures and RT is pushing VRAM requirements.

Spider-Man Remastered is doing nothing fancy with streaming at ~200 MB/s. The CPU asset-decompression load is minimal in the context of current gen, yet it is causing performance to not be in step with the hardware differential. To my knowledge, this is the first PS5-only game with RT to be ported to PC, so it makes sense this is the first time we are seeing such a disparity. Miles Morales is up next, and if I'm not mistaken the textures and/or geometry are slightly higher in that game.



Nope, Ratchet and Clank is doing this right now. And I will bookmark this conversation and come back when it releases for PC to prove that a similar performance/hardware disparity will exist.
Dude, even the Matrix demo, the most 'next gen' thing we have so far, only streams around 500 MB/s according to the devs, and will work on a SATA SSD, which it does, as I tried it.

This will only end in tears dude, I thought we were past all this by now.
 

yamaci17

Member
Right, I think I'm getting you. You're saying the RAM problems are basically holding back the 3070, and you should be getting way better results than the 3060 and PS5 but you aren't?

Well hopefully these problems get patched. All the problems with HZD and GOW got patched in the end.
Yes. When targeting 4K, I can understand 8 GB not being able to match the 10 GB budget the PS5 has. But at 1440p, an 8 GB budget should be more than enough.

see here;


You remember the video above, where the 3060 was handling native 4K with ray tracing fine, always getting upwards of 30+ frames like the PS5 does?

See this: it clearly runs into a VRAM bottleneck. GPU memory usage shows as 9 GB (you can be sure 600-800 MB of that is background apps). If my theory is correct, the game limits itself to 8 GB on a 10 GB card, so 9 GB total usage means the game is nearing 8 GB of VRAM consumption. Once it gets there, it retracts back to 7.4 GB. Notice how the game declines to use the available VRAM from that point on. This is simply not a performance bottleneck, it is a VRAM bottleneck, while the card has a freaking 3 GB of memory available for use. You can bet the exact same settings would run at enormously higher framerates on the 12 GB version.

The exact same thing happens to me: once it gets past 6.4 GB of usage, the game retracts back to 5.4 GB and never decides to use that VRAM again. You can see the exact same behaviour on the 3080: once it got past 8 GB (9 GB total), it retracted back to 7.4 GB of usage and never went above that value. The game simply has a quirky memory-allocation policy that hurts the GPUs that have "just enough" memory to run things.

"And you missed MY point, even though you mentioned it with the bolded. The combination of high textures and RT is pushing VRAM requirements."

I've told you, it has not. when matched ps5 settings, game only needs 10 GB VRAM at 4K, which is what PS5 can allocate to this game at maximum, logically. you need exactly what tha game needs on a PS5. only problem is that Nixxes decided to cap everyone at %80 usage for their own sake (so that people can run endless chrome tabs without moaning about the game is slowing down their desktop)
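If my theory is correct, the allocator behaves something like this. A purely hypothetical sketch: the class, names, and numbers are illustrative, not Nixxes' actual code:

```python
# Purely hypothetical sketch of the memory policy described above: the game
# caps its own allocations at ~80% of VRAM and, once it nears the cap,
# retracts ~1 GB and spills further streaming into system RAM instead.
# Names and numbers are illustrative; this is NOT Nixxes' actual code.
class StreamingPool:
    def __init__(self, total_vram_gb, cap=0.80, retract_gb=1.0):
        self.budget = total_vram_gb * cap  # 8 GB card -> 6.4 GB usable
        self.retract_gb = retract_gb       # how far it backs off at the cap
        self.vram_used = 0.0
        self.ram_spill = 0.0               # spilled data crosses PCIe every frame

    def allocate(self, size_gb):
        if self.vram_used + size_gb <= self.budget:
            self.vram_used += size_gb
        else:
            # Near the cap: evict, back off, and push data into system RAM.
            evicted = min(self.retract_gb, self.vram_used)
            self.vram_used -= evicted
            self.ram_spill += size_gb + evicted

pool = StreamingPool(8.0)   # 8 GB card
for _ in range(7):
    pool.allocate(1.0)      # stream the city in 1 GB chunks
# VRAM usage stalls below the cap while the spill keeps growing, even though
# over 1 GB of VRAM sits unused -- matching the behaviour in the screenshots.
print(pool.vram_used, pool.ram_spill)
```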
 
Last edited:

yamaci17

Member
Here's the situation with Very High textures + PS5-equivalent settings:
in this scene a VRAM-bound bottleneck happened and frames dropped.

Go check how many people say they experience "frame drops on the streets". They're not even aware they're running out of VRAM (artificially limited by Nixxes for unknown reasons). You can clearly observe the total maximum VRAM used is 6.6 GB, with a whopping 1.6 GB still available for use

Drop the textures and bam, 80% of the performance is clawed back. This is at 1440p; now imagine the situation at 4K.






Mind you, even the 68 fps here is not the true performance the 3070 is capable of. You can still see an excess 2.4 GB in system memory EVEN with Low textures, and that excess memory is causing performance drops.

Despite having an open 2 GB of VRAM available here, the game still refuses to use it and uses 2.36 GB of normal RAM instead

You can see that the GPU's power is barely tapped: it only draws 130-160 watts, while this GPU is rated for 220 watts.
 
Last edited:
Top Bottom