
Digital Foundry - Marvel's Spider-Man 2 Secrets Revealed: Debug Menu Features Tested

Bojji

Member
You never showed us that video benchmark with your 2.15x claim. Until you can show it, shut the fuck up and stop quoting me. I grow tired of the idiocy of your posts.

It can be even higher than 2.5x. In GPU-limited places (the 4080S is like 1% better than the 4080), PS5 vs. 4080S:

[benchmark screenshots]
 
It can be even higher than 2.5x. In GPU-limited places (the 4080S is like 1% better than the 4080), PS5 vs. 4080S:

[benchmark screenshots]
All those games run quite a bit better on Series X, so I feel it makes more sense to compare the 4080 to that there. The first game is an NVIDIA-favored game (see the 2070 Super outperforming the 6700 in raster). You're also linking an RT game when the discussion is on rasterization. For reference, the 4080 has more than 4x the RT power of the PS5.
 

Gaiff

SBI’s Resident Gaslighter
Here it is with a 13600K (which is still kind of lopsided). Focus on the 4K part and you will see it never hits above 70.

This is the last reply I'll be giving you because frankly, I'll get banned for what I'm about to say next. One, those aren't matched settings, and two, it's one game you fucking know is one of the worst PC ports in the past 5 years. Is your point that the RTX 4080 is 2.15x faster in the worst-case scenario? Cool, because it's 2.5x overall.

All those games run quite a bit better on Series X, so I feel it makes more sense to compare the 4080 to that there. The first game is an NVIDIA-favored game (see the 2070 Super outperforming the 6700 in raster). You're also linking an RT game when the discussion is on rasterization. For reference, the 4080 has more than 4x the RT power of the PS5.
Aka, they don't match up your bullshit narrative so you cling to one game that runs notoriously terribly on PC to make a hollow point.

Now please, fuck off and stop derailing the thread. This is about Spider-Man and your obsession of following me around just to quote me with bullshit that I shut down every time is annoying. Don't bother answering me because I won't answer you back.
 

Bojji

Member
All those games run quite a bit better on Series X, so I feel it makes more sense to compare the 4080 to that there. The first game is an NVIDIA-favored game (see the 2070 Super outperforming the 6700 in raster). You're also linking an RT game when the discussion is on rasterization. For reference, the 4080 has more than 4x the RT power of the PS5.

You can exclude CP with RT, but you have a clear raster comparison of AW2 and PT in GPU-limited places (graphics modes): the 4080S is WAY faster than just 2.15x, it's 2.8x and 3x. I don't know what Xbox has to do with this comparison.
 

Gaiff

SBI’s Resident Gaslighter
You can exclude CP with RT, but you have a clear raster comparison of AW2 and PT in GPU-limited places (graphics modes): the 4080S is WAY faster than just 2.15x, it's 2.8x and 3x. I don't know what Xbox has to do with this comparison.
Don't bother. This kid is mentally deranged. He shows a benchmark of one of the worst-performing games to prove a point, and the game runs at Ultra on the 4080. Spoilers: it doesn't run at Ultra on the PS5, even in Fidelity Mode. It runs at High with 4x AF and one or two settings at Ultra.

[PS5 settings screenshots]


Those are PS5 settings. Not Ultra.
 
You can exclude CP with RT, but you have a clear raster comparison of AW2 and PT in GPU-limited places (graphics modes): the 4080S is WAY faster than just 2.15x, it's 2.8x and 3x. I don't know what Xbox has to do with this comparison.
I brought up Xbox because that game doesn't seem to favor the way the PS5 GPU is designed, yet a GPU of almost the exact same class performs much, much better in it, showing it isn't a GPU grunt/strength thing. Check if the 4080 is 2.5x faster than the Series X there; it shouldn't be. Last thing: you linked the DF benchmarks, which I'm trying to avoid because they use drastically better CPUs. We were finding benchmarks with CPUs much closer to the PS5's.
 
Don't bother. This kid is mentally deranged. He shows a benchmark of one of the worst-performing games to prove a point, and the game runs at Ultra on the 4080. Spoilers: it doesn't run at Ultra on the PS5, even in Fidelity Mode. It runs at High with 4x AF and one or two settings at Ultra.

[PS5 settings screenshots]


Those are PS5 settings. Not Ultra.
You're calling people morons, yet I was linked a DF benchmark, which we were avoiding for the aforementioned reasons.
 
Don't bother. This kid is mentally deranged. He shows a benchmark of one of the worst-performing games to prove a point, and the game runs at Ultra on the 4080. Spoilers: it doesn't run at Ultra on the PS5, even in Fidelity Mode. It runs at High with 4x AF and one or two settings at Ultra.

[PS5 settings screenshots]


Those are PS5 settings. Not Ultra.
We can test Jedi Survivor if you want.
 

Bojji

Member
Don't bother. This kid is mentally deranged. He shows a benchmark of one of the worst-performing games to prove a point, and the game runs at Ultra on the 4080. Spoilers: it doesn't run at Ultra on the PS5, even in Fidelity Mode. It runs at High with 4x AF and one or two settings at Ultra.

[PS5 settings screenshots]


Those are PS5 settings. Not Ultra.

Yep, and it's the only game DF tested that performs way below other games on PC (vs. PS5).

I brought up Xbox because that game doesn't seem to favor the way the PS5 GPU is designed, yet a GPU of almost the exact same class performs much, much better in it, showing it isn't a GPU grunt/strength thing. Check if the 4080 is 2.5x faster than the Series X there; it shouldn't be. Last thing: you linked the DF benchmarks, which I'm trying to avoid because they use drastically better CPUs. We were finding benchmarks with CPUs much closer to the PS5's.

On paper the Xbox has the faster GPU, so it should perform better than the PS5. This SHOULD be the norm - why it isn't is a big mystery.

CPU isn't a factor in those tests but I know I won't convince you.

Edit: On topic, the Spider-Man 2 debug menu shows us that DRS is aggressive, but this is how it should work to keep the framerate stable, and the game delivers on that front. Insomniac is balls deep in Sweet Baby nonsense, but they are the best Sony developer when it comes to tech, alongside GG (ND is sleeping).
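For anyone curious what "aggressive DRS" means mechanically: a dynamic resolution scaler watches GPU frame time and nudges the render scale so the frame stays inside its budget. The sketch below is a generic feedback controller, not Insomniac's actual implementation; the target, bounds, and sample frame times are invented for illustration.

```python
# Minimal dynamic resolution scaling (DRS) controller sketch.
# Generic illustration only; constants and frame times are invented.

TARGET_MS = 16.6            # frame budget for 60fps
MIN_SCALE, MAX_SCALE = 0.5, 1.0

def update_scale(scale: float, gpu_frame_ms: float) -> float:
    """Nudge the render scale so GPU frame time converges on the budget."""
    error = TARGET_MS / gpu_frame_ms      # >1 means headroom, <1 means over budget
    # Pixel cost grows roughly with the square of the scale, so correct by sqrt.
    scale *= error ** 0.5
    return max(MIN_SCALE, min(MAX_SCALE, scale))

scale = 1.0
for ms in [20.0, 19.0, 17.5, 16.8, 16.2]:   # a heavy scene settling down
    scale = update_scale(scale, ms)
print(f"render scale after heavy scene: {scale:.2f}")
```

The point is that resolution drops immediately under load and creeps back up when there is headroom, which is why a well-tuned DRS game holds its framerate target while pixel counts bounce around.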
 

ChiefDada

Gold Member
Dunno, I find HFW more impressive. I was initially defending it with the pre-release trailers, especially that first one where Miles glides over the water and sneaks into that plant.

The actual game though? I don't know. Kind of inconsistent. There are some setpieces like the Sandman that look awe-inspiring but at other times, I can hardly spot the difference between 2 and the remastered version on PC/PS5.

I was just expecting a lot more out of it. I thought it would be a crowning achievement and THE game to beat. Overall, I think it looks alright with amazing bits here and there.

Wow. Different strokes...

Well, if you favor cinematics, that is an area where Insomniac bemusingly dropped the ball and Horizon truly excels. But for someone like myself who prioritizes seamless in-game traversal with stable assets and near-invisible LOD management, Horizon is surprisingly terrible here imo and Spider-Man 2 is GOAT status.
 

Gaiff

SBI’s Resident Gaslighter
Wow. Different strokes...

Well, if you favor cinematics, that is an area where Insomniac bemusingly dropped the ball and Horizon truly excels. But for someone like myself who prioritizes seamless in-game traversal with stable assets and near-invisible LOD management, Horizon is surprisingly terrible here imo and Spider-Man 2 is GOAT status.
I guess I would need to play them. Just speaking from what I've seen. I did hear that the LOD of FW was pretty bad, especially on flying mounts that are quite slow.

You're right that the fast streaming and traversal of Spidey 2 is a lot better but I just don't find the geometric density very impressive. When I see zoomed-in shots of FW, I'm really amazed by the pristine quality of the geometry and textures. Spidey 2 though? Not really. There might be a lot more going on such as traffic, pedestrians, and all that but I find the average quality to be lower.

I'll probably really be able to judge when FW comes to PC and whenever the PS5 Pro hits so I can try Spider-Man 2 on that.

Oh, and Spider-Man 2 has RT in every mode so it has that going for it.
 

Mr Moose

Member
Oh, and Spider-Man 2 has RT in every mode so it has that going for it.
I thought so, all since launch right?
This comment in the vid/article made me wonder:
Interestingly, this mode now has ray traced reflections on all transparent surfaces, which wasn't the case at launch.
All games should have that fps/res thing in them.
 
Yep, and it's the only game DF tested that performs way below other games on PC (vs. PS5).



On paper the Xbox has the faster GPU, so it should perform better than the PS5. This SHOULD be the norm - why it isn't is a big mystery.

CPU isn't a factor in those tests but I know I won't convince you.

Why wouldn't they just update it if/when the pro comes out?
They may not want to go back and do a dedicated patch
 

SlimySnake

Flashless at the Golden Globes
Addressed in the video, they did a pixel and performance check to confirm the debug menu results.
No, he did a pixel count in debug mode to confirm it matched the on-screen readings.

He should've done a pixel count in the actual game without debug mode enabled.
 

Zathalus

Member
No, he did a pixel count in debug mode to confirm it matched the on-screen readings.

He should've done a pixel count in the actual game without debug mode enabled.
When he says he did a pixel count, the debug menu is not on the screen. But it's not completely clarified either way.

It shouldn't really matter though, as a frame-time graph and fps indicator don't impact performance in a notable way, certainly not the GPU. No way to be 100% certain of that in this case either.
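For readers unfamiliar with the pixel counting being argued about: DF-style counts look at an aliased edge in a captured frame, count the unique stair-steps it shows, and scale that against the output resolution to estimate the internal render resolution. The function below is a simplified version of that arithmetic, assuming an idealized upscale; real counts are done by hand on uncompressed captures.

```python
# Simplified pixel-counting arithmetic: if an aliased edge shows `steps`
# unique stair-steps across `span` output pixels, the internal width is
# roughly output_width * steps / span. Idealized model, illustration only.

def internal_width(output_width: int, steps: int, span: int) -> int:
    return round(output_width * steps / span)

# e.g. 1440 steps counted across a full 2160-pixel span of a 4K frame
# suggests a 2560x1440 internal image upscaled to 3840x2160.
print(internal_width(3840, 1440, 2160))  # -> 2560
```

This is also why the debug-menu resolution readout and a manual count can cross-check each other: both are estimating the same internal figure by different routes.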
 

SlimySnake

Flashless at the Golden Globes
When he says he did a pixel count, the debug menu is not on the screen. But it's not completely clarified either way.

It shouldn't really matter though, as a frame-time graph and fps indicator don't impact performance in a notable way, certainly not the GPU. No way to be 100% certain of that in this case either.
There is more going on in debug modes than just FRAPS counters.
 

SlimySnake

Flashless at the Golden Globes
It's not even a debug build of the game. What exactly is going on when the performance metrics are on the screen?

debug build?

The debug mode is on. When it's enabled, performance takes a hit because there are other dev tools and processes loaded in the background that would otherwise not be enabled.
 

Puscifer

Member
Does it not achieve its performance targets in all modes for the most part? Is it not the best showcase of dense, open-world asset streaming on console, and perhaps on any platform? Yeah, I'm done defending the blatant tech achievements of SM2. You'll appreciate it once it comes to PC, I'm sure.
I'd argue Metro Exodus on consoles is still the best; it has a full RTGI implementation that SM2 completely lacks.
 

Zathalus

Member
debug build?

The debug mode is on. When it's enabled, performance takes a hit because there are other dev tools and processes loaded in the background that would otherwise not be enabled.
It's not the debug build though, as numerous menu options flat out don't work. It's the regular retail build of the game with the debug menu enabled for access by mistake.

No debug 'mode' is enabled; it is simply a menu that can be accessed to enable/disable certain features or tools. By this logic, numerous games on PC are always running in a debug mode just because you can enable debug functions (usually by pressing the tilde key).
 

SlimySnake

Flashless at the Golden Globes
It's not the debug build though, as numerous menu options flat out don't work. It's the regular retail build of the game with the debug menu enabled for access by mistake.

No debug 'mode' is enabled; it is simply a menu that can be accessed to enable/disable certain features or tools. By this logic, numerous games on PC are always running in a debug mode just because you can enable debug functions (usually by pressing the tilde key).
Again, this is not about the fucking FRAPS counters. You can literally hack the game by running a dozen different things from the debug menu. Why are you arguing over this?
 
I don't see your point. This is Spider-Man 2, probably one of the most CPU-intensive games on PS5. I don't even think it has a 120Hz uncapped mode like the first one which could shoot past 100fps at 1080p. Clearly, this one is much more demanding.
Don't act like Spider-Man is the only game. You just don't want it to be the case, because if you had to admit the CPU in the PS5 was holding the GPU back, you would have to renege on some of your benchmarking.
 

Gaiff

SBI’s Resident Gaslighter
Don't act like Spider-Man is the only game. You just don't want it to be the case, because if you had to admit the CPU in the PS5 was holding the GPU back, you would have to renege on some of your benchmarking.
Once again, shut the fuck up. This is a PS5-only game that we cannot run on PC at the moment so we don't even know the CPU impact. If we run it on a 3600 and still manage to get 100fps, then the game isn't that CPU-intensive. If it tanks to 50-60fps, then clearly it is. This tells us nothing. Do you ever make posts that aren't garbage? I'd put you on ignore but it's a useless feature on most forums.

And those benchmarks I showed you were 4K, we used them to eliminate CPU bottlenecks. Do you seriously think the PS5's CPU is so slow it's causing the GPU to fall below 35fps? Nonsense.
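The logic behind using 4K benchmarks to sideline the CPU follows from the usual simplification that a frame is paced by whichever of the CPU or GPU finishes its work last. The millisecond figures below are invented purely to illustrate that point, not taken from any benchmark.

```python
# Toy bottleneck model: per-frame time is set by the slower of CPU and GPU.
# All millisecond figures are invented for illustration.

def fps(cpu_ms: float, gpu_ms: float) -> float:
    return 1000.0 / max(cpu_ms, gpu_ms)

# At 4K the GPU dominates: swapping a slow CPU (12 ms/frame of work) for a
# fast one (5 ms) changes nothing while the GPU needs 30 ms per frame.
print(fps(cpu_ms=12.0, gpu_ms=30.0))  # ~33.3 fps
print(fps(cpu_ms=5.0,  gpu_ms=30.0))  # ~33.3 fps, identical: GPU-bound

# At 1080p the GPU cost drops and the CPU starts to set the ceiling.
print(fps(cpu_ms=12.0, gpu_ms=8.0))   # ~83.3 fps, now CPU-bound
```

This is why a 13900K vs. a 3600 is irrelevant in a 4K, 30fps-class comparison but very relevant in unlocked 40-60fps modes, which is the actual crux of the argument in this thread.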
 

Zathalus

Member
Again, this is not about the fucking FRAPS counters. You can literally hack the game by running a dozen different things from the debug menu. Why are you arguing over this?
Alright, what exactly is being enabled in the background when the frame-time graph is on the screen and what is the impact on GPU utilization?
 

yurinka

Member
On paper the Xbox has the faster GPU, so it should perform better than the PS5. This SHOULD be the norm - why it isn't is a big mystery.
It isn't a mystery, it's because of the vastly superior I/O system on PS5. Compressed data (later decompressed via hardware) moving from SSD to memory, and memory access from both CPU and GPU, allow the PS5 to achieve a higher percentage of its theoretical peak performance than Xbox or PC.

For that reason it performs better than supposed equivalents and matches the performance of theoretically superior hardware.

The PS5 basically makes more optimized use of its memory resources, allowing its CPU and especially its GPU to shine more than they do elsewhere.
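The bandwidth-multiplying effect being described can be illustrated with any general-purpose compressor: the disk only has to deliver the compressed bytes, and decompression restores the rest. zlib below is just a stand-in (the PS5's dedicated silicon actually decompresses Kraken/Oodle streams), and the payload and ratio are synthetic.

```python
import zlib

# Why on-the-fly decompression raises effective bandwidth: the SSD only
# delivers the compressed bytes. zlib stands in for hardware Kraken here;
# the repetitive payload is synthetic and compresses unrealistically well.

raw = b"position normal tangent uv0 uv1 " * 4096   # asset-like data
packed = zlib.compress(raw, level=6)

ratio = len(raw) / len(packed)
print(f"{len(raw)} bytes raw -> {len(packed)} bytes on disk ({ratio:.1f}x)")

# Effective read bandwidth scales by the same ratio: an SSD delivering
# 2x-compressed data feeds twice its raw speed in usable asset data.
assert zlib.decompress(packed) == raw   # lossless round-trip
```

The design point is that the decompression happens in fixed-function hardware, so the CPU pays nothing for it, which is the part of the claim that distinguishes the consoles from a stock PC doing the same thing in software.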
 
Once again, shut the fuck up. This is a PS5-only game that we cannot run on PC at the moment so we don't even know the CPU impact. If we run it on a 3600 and still manage to get 100fps, then the game isn't that CPU-intensive. If it tanks to 50-60fps, then clearly it is. This tells us nothing. Do you ever make posts that aren't garbage? I'd put you on ignore but it's a useless feature on most forums.

And those benchmarks I showed you were 4K, we used them to eliminate CPU bottlenecks. Do you seriously think the PS5's CPU is so slow it's causing the GPU to fall below 35fps? Nonsense.
Once again, stop being a prick believing you have any high ground right now. A couple of months ago you were suggesting RT was just a GPU limitation until I revealed it's also a major CPU one as well. You link benchmarks with 13900Ks and 7800X3Ds after we specifically said not to, because you wanted to ignore what was said. I don't think you're a troll like the other guy said about you; I believe you're butthurt as a PC player.
 

Gaiff

SBI’s Resident Gaslighter
Once again, stop being a prick believing you have any high ground right now. A couple of months ago you were suggesting RT was just a GPU limitation until I revealed it's also a major CPU one as well.
I never suggested that. RT is both CPU and GPU-intensive. Stop making shit up. "Until I revealed," you know nothing and haven't got jack to teach me.
You link benchmarks with 13900Ks and 7800X3Ds after we specifically said not to, because you wanted to ignore what was said. I don't think you're a troll like the other guy said about you; I believe you're butthurt as a PC player.
We already explained this to you a million times. I won't repeat myself again. You always use CPU limitation as a cop-out. Why the fuck do you think I used 4K benchmarks and not 1080p? Because at 4K, when games run at 30fps on a PS5, the CPU is a non-factor. Throwing in a 13900K would change fuck-all.

And yes, I think you're a troll because you act like a brick wall. We have to repeat the same thing to you over and over again, and you come back in the next thread with the exact same stupid points every time. How many times have we told you to use the multi-quote feature, yet you still post 10 times in a row?

Stop quoting me, I won't answer you. You just annoy me with your nonsense. Exchanges with you aren't insightful or useful, they're moronic and excruciating.
 
I never suggested that. RT is both CPU and GPU-intensive. Stop making shit up. "Until I revealed," you know nothing and haven't got jack to teach me.

We already explained this to you a million times. I won't repeat myself again. You always use CPU limitation as a cop-out. Why the fuck do you think I used 4K benchmarks and not 1080p? Because at 4K, when games run at 30fps on a PS5, the CPU is a non-factor. Throwing in a 13900K would change fuck-all.

And yes, I think you're a troll because you act like a brick wall. We have to repeat the same thing to you over and over again, and you come back in the next thread with the exact same stupid points every time. How many times have we told you to use the multi-quote feature, yet you still post 10 times in a row?

Stop quoting me, I won't answer you. You just annoy me with your nonsense. Exchanges with you aren't insightful or useful, they're moronic and excruciating.
Not when RT is involved. And if we were benching the PS5 against 30fps it obviously wouldn't be CPU-limited, but we were benching against the unlocked framerate modes, which hover in the 40-55 range most of the time. You still think there was never a CPU limit?
 