
Death Stranding - In-depth PS5 vs Nvidia & AMD PC - Performance Deep Dive

What's interesting in this analysis is that we finally have an incredibly honest comparison between console and PC. Often, people compare a PC's average framerate against a console's absolute minimum framerate.

So they'll say, for instance, "hey, I run this at native 4K at 80fps on my PC; consoles run like shit at 50fps", when they're actually comparing an average uncapped FPS on PC against an absolute minimum framerate on consoles.
 
Man, the abuse that would get hurled at me for saying the 750 Ti was a weaker GPU than the PS4's GPU at the time, lol.

Much better when you take the PS5's shared bandwidth into account. The 5700 XT gets all 448GB/s to itself, the PS5 doesn't, yet it has a higher avg. fps than a PC with a 5700 XT.
I doubt that will prove to be an issue very often. The PS5 is also about 10% faster than a 5700 XT, so it performing ~10% better makes sense to me.
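For what it's worth, that ~10% figure roughly checks out on paper (a back-of-envelope sketch using public spec-sheet numbers; the PS5's clock is variable, so its figure is a ceiling, and a real 5700 XT lands somewhere between its rated game and boost clocks):

# RDNA FP32 throughput: CUs * 64 shaders/CU * 2 ops/clock * clock
def tflops(cus, mhz):
    return cus * 64 * 2 * mhz / 1e6

ps5 = tflops(36, 2230)        # PS5: 36 CUs at up to 2.23 GHz -> ~10.28 TF
xt_game = tflops(40, 1755)    # 5700 XT rated game clock      -> ~8.99 TF
xt_boost = tflops(40, 1905)   # 5700 XT rated boost clock     -> ~9.75 TF

print(f"PS5 {ps5:.2f} TF vs 5700 XT {xt_game:.2f}-{xt_boost:.2f} TF")
print(f"PS5 advantage: {ps5 / xt_boost - 1:+.0%} to {ps5 / xt_game - 1:+.0%}")
# -> roughly +5% to +14%, which brackets the ~10% claim (bandwidth aside)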
 

Md Ray

Member
Put your display on 120Hz or higher and enable Vsync in the menu. It should uncap the 60fps softlock.
Tried both 144Hz and 120Hz, enabled in-game Vsync, set the framerate limit to 240, and the cutscenes still lock at 60fps. This is so frustrating.
 

SlimySnake

Flashless at the Golden Globes
I hope SlimySnake is watching this.

The video just proves how wrong this statement is, for this particular title.

I agree that when evaluating perf between multiple PC GPUs you should stick with a single powerful CPU, but as I said in my initial post, at 4K the CPU doesn't really matter much for this game -- especially when the graphics card you're using is something like a 2070/5600 XT, which at 4K is already dipping into the mid-40s to 30fps. It was very much a GPU-bound stress test at that point.
I am sorry, but he didn't pair it with the 3600XT in this second video. I don't know why this is so hard. It takes two seconds to take the GPU out of one PC and put it in another.

Not sure why he has to go find other PC comparisons on YouTube. He literally has all the tools he needs to do the comparison himself.

All I am asking for is a fair comparison. There is a 99% chance his tests are identical, but we NEED to do it the right way, otherwise wtf is the point. The cutscene he ran where Higgs first shows up is a great comparison point because that's where the PS5 drops the most frames. Let's take a 2070, put it in the same machine as the 3600, and run the tests.
 

yamaci17

Member
Tried both 144Hz and 120Hz, enabled in-game Vsync, set the framerate limit to 240, and the cutscenes still lock at 60fps. This is so frustrating.
why so? i played kena with its 24fps pre-rendered cutscenes and never got frustrated myself. do i have low expectations? i don't know
 

SlimySnake

Flashless at the Golden Globes
Tried both 144Hz and 120Hz, enabled in-game Vsync, set the framerate limit to 240, and the cutscenes still lock at 60fps. This is so frustrating.
This is an issue with KojiPro games. I recently ran MGSV, and even after removing the fps cap from the config files, it would cap at 105fps. The GPU utilization would be around 35% at native 4K at 105fps, so it could easily push another 200fps, but nope. Their engine just couldn't do it.

What you can do is rush through chapter 2. You can probably make all the deliveries in a couple of hours and head to the place where Higgs first shows up. Or see if you can find a save file. IIRC, you can replay all the bosses and the cutscenes should play before it.
 
Tried both 144Hz and 120Hz, enabled in-game Vsync, set the framerate limit to 240, and the cutscenes still lock at 60fps. This is so frustrating.
Oh, I keep forgetting people still don't force vsync off in the NVIDIA control panel after every driver install, for some mysterious reason.

You do not need Vsync with a VRR display [cap your frames with RTSS and voilà]. I keep repeating myself over and over, and yet some random d-asses always keep linking me to articles written by ill-informed individuals.
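To make the cap concrete (a minimal sketch of what any frame limiter does; RTSS itself uses far more precise hybrid timing): hold every frame to a fixed budget a few fps below the panel's refresh, so the framerate stays inside the VRR window and vsync never has to engage.

import time

TARGET_FPS = 138               # a few fps under a 144Hz panel keeps VRR engaged
FRAME_BUDGET = 1.0 / TARGET_FPS

def run_capped(render_frame, num_frames=1000):
    deadline = time.perf_counter()
    for _ in range(num_frames):
        render_frame()
        deadline += FRAME_BUDGET
        # coarse sleep for most of the remaining budget...
        remaining = deadline - time.perf_counter()
        if remaining > 0.002:
            time.sleep(remaining - 0.002)
        # ...then spin the last ~2ms for frame-time consistency
        while time.perf_counter() < deadline:
            pass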
 

yamaci17

Member
Oh, I keep forgetting people still don't force vsync off in the NVIDIA control panel after every driver install, for some mysterious reason.

You do not need Vsync with a VRR display [cap your frames with RTSS and voilà]. I keep repeating myself over and over, and yet some random d-asses always keep linking me to articles written by ill-informed individuals.

oh mighty stronk squirrel, i really strongly agree with you. i also saw that vsync recommendation on some websites like blurbusters. eventually that knowledge and "tip" made it into mainstream discussions, and now everyone and their mother will suggest that combo of frame limit + vsync enabled.

for me, whenever i enable vsync i can feel some weird extra lag. i even tried setting games to 120 fps; it almost feels like the combo is not working as it should. i don't even expect nvidia to actually put in the work for it to work. to me it feels like vsync is always on in some weird way, even when i'm way below my max refresh rate.

in the end, since i didn't actually have any issue with just capping the framerate to 120-130, i just disabled vsync and have been happy ever since. i don't even want to pronounce that thing anymore, i just hate it. it's just lag ...
 

Md Ray

Member
why so? i played kena with its 24fps pre-rendered cutscenes and never got frustrated myself. do i have low expectations? i don't know
No, I meant that trying every possible thing to uncap the cutscene frame rate, and it still not working, is what's frustrating. I have no problem with 60fps itself.
 
Well here is my 2070S at 4K max settings and v-synced. I guess Super being the operative word, as it crushes the 2070 in the video. It's not overclocked and sits at 1950 to 1980MHz out of the box, so it's not a shit card, but not some absurd silicon-lottery win either.

I didn't cherry-pick other than looking for bits where the PS5 looked to be miles ahead, so I've included plenty of shots where the PS5 wins.

It's behind the PS5, but not by that much, and Shadowplay eats 2-4fps. If a 3070 is 25-30% faster than a 2070S, it should absolutely clean house here unless there is something wrong with the rest of the PC.

The images are 1440p as I just screenshotted my recording but the game was running with these settings:

jqNWPIG.png


The UI will be really small on mobile as I didn't bump up the font so I'll list the figures for each set of images.

1)

2070 in video 42fps
PS5 57fps
My 2070s 55fps

aMT6JML.jpg

h1Bk65C.png


2)

2070 in video 41fps
PS5 60fps
My 2070s 54fps

hTPEsTD.jpg

atEep8T.jpg


3)

2070 in video 42fps
PS5 60fps
My 2070s 57fps

WUce09x.jpg

dYsVsVM.png


4)

2070 in video 30fps
PS5 40fps
My 2070s 41fps

xWFoXhH.jpg

KuHmWJB.png


5)

2070 in video 38fps
PS5 52fps
My 2070s 54fps

3ZCCoaQ.jpg

gJk8QLW.png


And here is an average from the end of the cutscene:

Avg fps:
2070 in video 41.63
PS5 56.97
My 2070s 54

WTFfHaw.jpg

49Nq5Ru.jpg


I'd guess the PS5 would be between a 2080 and a 2080S, which is no surprise for a game favoring AMD, and still impressive to be fair. Not sure why the 2070 in the video performs so poorly, but I'm not going to speculate.
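To put rough numbers on that "clean house" expectation (a sketch; the 25-30% uplift figure and the 2-4fps Shadowplay cost are the assumptions stated above):

ps5_avg = 56.97    # PS5 average from the video
my_2070s = 54.0    # my 2070S average, recorded with Shadowplay running

# Correct for the stated 2-4fps capture overhead:
print(f"2070S w/o capture: ~{my_2070s + 2:.0f}-{my_2070s + 4:.0f}fps vs PS5 {ps5_avg}fps")

# If a 3070 really is 25-30% faster than a 2070S:
for uplift in (0.25, 0.30):
    print(f"3070 at +{uplift:.0%}: ~{my_2070s * (1 + uplift):.0f}fps")
# -> ~68-70fps expected, well clear of the PS5's ~57fps average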
 
oh mighty stronk squirrel, i really strongly agree with you. i also saw that vsync recommendation on some websites like blurbusters. eventually that knowledge and "tip" made it into mainstream discussions, and now everyone and their mother will suggest that combo of frame limit + vsync enabled.

for me, whenever i enable vsync i can feel some weird extra lag. i even tried setting games to 120 fps; it almost feels like the combo is not working as it should. i don't even expect nvidia to actually put in the work for it to work. to me it feels like vsync is always on in some weird way, even when i'm way below my max refresh rate.

in the end, since i didn't actually have any issue with just capping the framerate to 120-130, i just disabled vsync and have been happy ever since. i don't even want to pronounce that thing anymore, i just hate it. it's just lag ...
Funny Thank You GIF by MOODMAN


People need to stop using vsync on VRR displays.
 

Md Ray

Member
The cutscene he ran where Higgs first shows up is a great comparison point because that's where the PS5 drops the most frames. Let's take a 2070, put it in the same machine as the 3600, and run the tests.
I can guarantee you it will run exactly the same even there; as you can see in the vid, the GPU is hitting 99% usage.
What you can do is rush through chapter 2. You can probably make all the deliveries in a couple of hours and head to the place where Higgs first shows up. Or see if you can find a save file. IIRC, you can replay all the bosses and the cutscenes should play before it.
That's the plan. Will see what I can do.
Oh, I keep forgetting people still don't force vsync off in the NVIDIA control panel after every driver install, for some mysterious reason.

You do not need Vsync with a VRR display [cap your frames with RTSS and voilà]. I keep repeating myself over and over, and yet some random d-asses always keep linking me to articles written by ill-informed individuals.
Things I tried:

-Put display refresh rate to 120Hz, 144Hz
-Disabled (forced off) both in-game and NVCP Vsync (also tried w/ only in-game Vsync set to on)
-Increased max framerate limit to 240 in-game
-Also used RTSS to limit frames to 120fps (with Vsync off, ofc)

Nothing worked. Cutscenes still render at 60fps. -_-
 

JackSparr0w

Banned
All good stuff, and that's even without a proper PS5 version... DS is intrinsically still a ported-over cross-gen title...
People jump on the "consoles are weak" narrative at the beginning of every gen.
Look at the early and infamous DF comparisons of the PS4 vs a GTX 750 Ti...
The PS4 tended to lose those battles, although everyone knew it was the early code and lack of proper API usage that made the GTX 750 Ti pull ahead of the PS4.
The advantage of 8GB of GDDR5 RAM, and the need for it, was also dismissed in forums.
PC folklore was "2GB of VRAM will be fine".
Look at how Far Cry 6 or HZD run on such cards now...

PS5 will pull away from even cards like an RTX 3070 in a few years on a regular basis.

I'm not sure what you think happened to the 750 Ti? It's still slightly ahead of an original PS4, even if nvidia doesn't bring driver optimizations for it anymore.

Maybe there are some VRAM-limiting scenarios, but in raw performance the 750 Ti is still more capable than a PS4, and lots of PC gamers still happily game with it.

The PS5 will never be ahead of the 3070. If the PS5 performs above expectations in Death Stranding DC, all it means is that some game-engine optimizations were brought in that haven't made their way to the PC version.
 
I can guarantee you it will run exactly the same even there; as you can see in the vid, the GPU is hitting 99% usage.

That's the plan. Will see what I can do.

Things I tried:

-Put display refresh rate to 120Hz, 144Hz
-Disabled (forced off) both in-game and NVCP Vsync (also tried w/ only in-game Vsync set to on)
-Increased max framerate limit to 240 in-game
-Also used RTSS to limit frames to 120fps (with Vsync off, ofc)

Nothing worked. Cutscenes still render at 60fps. -_-

No, you have to keep vsync on in the game's menu! I know NVCP overrides this setting, but try it - it won't enable vsync. And maybe try setting "preferred refresh rate" to "highest available" in NVCP.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Well here is my 2070S at 4K max settings and v-synced. I guess Super being the operative word, as it crushes the 2070 in the video. [...]

I'd guess the PS5 would be between a 2080 and a 2080S, which is no surprise for a game favoring AMD, and still impressive to be fair. Not sure why the 2070 in the video performs so poorly, but I'm not going to speculate.

Without all the usual console warriors, these analysis threads are actually super interesting.
What are the specs of your machine?

P.S. We are gonna need another RTX 3070 to come and represent... heck, anyone with a 3060 Ti should jump in; that's basically bar-for-bar a 2080S.
 

yamaci17

Member
Without all the usual console warriors, these analysis threads are actually super interesting.
What are the specs of your machine?

P.S. We are gonna need another RTX 3070 to come and represent... heck, anyone with a 3060 Ti should jump in; that's basically bar-for-bar a 2080S.

okay... it's my time to shine, then. but i trust Md Ray; i feel like the 2070S user is making a mistake somewhere. is he really maxing out? i see another ">" next to the very high settings preset

are these cutscenes and locations in the beginning sequence of the game?
 

Md Ray

Member
No, you have to keep vsync on in the game's menu! I know NVCP overrides this setting, but try it - it won't enable vsync. And maybe try setting "preferred refresh rate" to "highest available" in NVCP.
God damn, it requires this specific combination to get it to uncap the cutscene frame rate:

Force Vsync off in NVCP (I was setting this to "Use 3D app setting" before)
+
Set display refresh from 144Hz to 120Hz in NVCP
+
Set in-game Vsync On

If any one of the above settings differs, then it goes back to 60fps again.

Thank you!
cT7gmvk.png
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
HoHo time for business aye
castlevania-demon-army.jpg


I have a 1060 which is basically better than the PS5. Let me know what you want me to test.
Err... I don't think a GTX 1060 can even get to 30fps at 4K, let alone average 60.
But we are basically getting the average FPS of the opening cutscene of the game.
okay... it's my time to shine, then. but i trust Md Ray; i feel like the 2070S user is making a mistake somewhere

are these cutscenes and locations in the beginning sequence of the game?

Right now we are literally comparing the opening cutscene because it's easy to get to (lol).

Md Ray just posted a way to get the frame rate unlocked if the game is softlocking to 60fps during the cutscene.
It's an issue that was reported at launch; I just forgot what hoops had to be jumped through to actually unlock the frame rate.
I remember pausing and unpausing could do it... but I might have actually already done what Md posted below.
God damn, it requires this specific combination to get it to uncap the cutscene frame rate:

Force Vsync off in NVCP (I was setting this to "Use 3D app setting" before)
+
Set display refresh from 144Hz to 120Hz in NVCP
+
Set in-game Vsync On

If any one of the above settings differs, then it goes back to 60fps.

Thank you!
cT7gmvk.png

What's your average frame rate now?
 
Without all the usual console warriors, these analysis threads are actually super interesting.
What are the specs of your machine?

P.S. We are gonna need another RTX 3070 to come and represent... heck, anyone with a 3060 Ti should jump in; that's basically bar-for-bar a 2080S.
I have a 3060 Ti but don't have the game. Feel free to start a GoFundMe for it.
 

jroc74

Phone reception is more important to me than human rights
Amazing that he had to do a follow-up video just to reiterate that it's about the GPU.

Are people still doubting the GPU in the PS5, or still calling it RDNA 1...

Interesting if so.
 

SlimySnake

Flashless at the Golden Globes
I'd guess the PS5 would be between a 2080 and a 2080S, which is no surprise for a game favoring AMD, and still impressive to be fair. Not sure why the 2070 in the video performs so poorly, but I'm not going to speculate.
The 2070 performs even worse than a 5700, even though it should be on par with the 5700 XT. This game is basically like AC Valhalla and other AMD-focused titles.

NX Gamer says that his overclocked 2070 is equivalent to a 2070 Super. Your tests confirm that it isn't even close.

8DU6CKa.png
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
I'm not sure what you think happened to the 750 Ti? It's still slightly ahead of an original PS4, even if nvidia doesn't bring driver optimizations for it anymore.

Maybe there are some VRAM-limiting scenarios, but in raw performance the 750 Ti is still more capable than a PS4, and lots of PC gamers still happily game with it.

The PS5 will never be ahead of the 3070. If the PS5 performs above expectations in Death Stranding DC, all it means is that some game-engine optimizations were brought in that haven't made their way to the PC version.
I'm not sure what YOU think happened to the GTX 750 Ti...
It's absolutely getting crushed by the PS4 in all major AAA titles lately.
And all this even with a much, much faster CPU, since nobody actually pairs it with that octa-core Jaguar APU...
Console optimisation is not a myth but real...
 

SlimySnake

Flashless at the Golden Globes
In the Deathloop thread, wasn't he saying it was a whole-system comparison to validate his conclusions? And now he has gone the other way here. Why the change?
Yeah, I remember getting into an argument with him over that. However, the 2700 is actually a good test for comparing the PS5 GPU to these PC GPUs, since the PS5 GPU is probably being held back by its pared-down Zen 2 CPU. Alex never takes that into account when doing his comparisons. So NX Gamer is on to something here.
 

hlm666

Member
Yeah, I remember getting into an argument with him over that. However, the 2700 is actually a good test for comparing the PS5 GPU to these PC GPUs, since the PS5 GPU is probably being held back by its pared-down Zen 2 CPU. Alex never takes that into account when doing his comparisons. So NX Gamer is on to something here.
If the DC comes to PC like the rumor mill is saying, and the PC version gets improvements the way Flight Sim benefited from the console work, do you think he will do another video covering the PC improvements? Yeah, me either.
 

yamaci17

Member
Yeah, I remember getting into an argument with him over that. However, the 2700 is actually a good test for comparing the PS5 GPU to these PC GPUs, since the PS5 GPU is probably being held back by its pared-down Zen 2 CPU. Alex never takes that into account when doing his comparisons. So NX Gamer is on to something here.
i don't think so, console cpus work more efficiently

here are the problems:

- baseline api optimization, non-existent on pc
- nvidia driver overhead

i have very relevant test results from an anandtech user:

codTjLq.png
moRICzP.png

practically, if you pair a ryzen 2700x with an amd gpu instead of an nvidia gpu, you magically get 20-25% more CPU-bound frames due to nvidia driver overhead

since the ps5/xbox all run on amd hardware, i'm pretty sure the 2700 is at a huge disadvantage here when paired with an nvidia gpu

then again, it's not the only factor. it is clear that console games are less cpu-bound in general thanks to their API and such. remember, battlefield 5 and 2042 run at 45-60 fps on a ps4, which has 1.6 ghz jaguar cores, on 64-player maps. on the other end of the spectrum, an fx 8350 struggles to push barely 45-50 fps in bf5 even when clocked at 4.5 ghz at the lowest settings (i know it has fake cores and the ps4 has real cores, but then again, it's 3 times the frequency and the ps4 can only allocate 6 cores to games, so it kinda evens out)

this is also why their 5600xt + 3600 system is much more stable in framerates and has a slightly higher avg. compared to their 2700 + 2070 system

you can see 99% gpu util., but it can be meaningless. i've seen 99% in certain games with various different cpus, and the avg. framerate can differ even in this condition. so the 2700 probably soft-bottlenecks the 2070 even though it appears there's no bottleneck. on the other hand, the 3600 + 5600xt system enjoys no driver-overhead bottleneck and pushes even more solid cpu performance, and of course zen 2 by itself is superior to zen+

in other words, even if we accept the "2700x" as a substitute for the ps5 cpu, you still have to use a cpu that is at least 25% faster than a 2700x if you're going to compare an nvidia gpu with the consoles to get the optimal cpu-bound situation. because even if the 2700x can perform like the ps5 cpu, it will still be at a 25% disadvantage, which can be huge, since with such high-end gpus there's no room for error and performance loss
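that last point in numbers (a sketch of the reasoning above, taking the 20-25% driver-overhead figures at face value):

# If the NVIDIA driver path costs 20-25% of CPU-bound throughput,
# the CPU feeding the NVIDIA card must be that much faster to break even.
for overhead in (0.20, 0.25):
    amd_fps = 100.0                    # hypothetical CPU-bound fps, AMD GPU
    nv_fps = amd_fps * (1 - overhead)  # same CPU driving an NVIDIA GPU
    print(f"{overhead:.0%} overhead -> CPU must be "
          f"{amd_fps / nv_fps - 1:.0%} faster to compensate")
# -> 25% and 33% respectively; hence "at least 25% faster than a 2700x"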
 

NXGamer

Member
Well here is my 2070S at 4K max settings and v-synced. I guess Super being the operative word, as it crushes the 2070 in the video. [...]

I'd guess the PS5 would be between a 2080 and a 2080S, which is no surprise for a game favoring AMD, and still impressive to be fair. Not sure why the 2070 in the video performs so poorly, but I'm not going to speculate.
Always great to get more info. I have compared my 2070 OC to a 2070 Super in another video here.

Can you capture your run and upload it to YT for this scene?
 
Well here is my 2070S at 4K max settings and v-synced. I guess Super being the operative word, as it crushes the 2070 in the video. [...]

I'd guess the PS5 would be between a 2080 and a 2080S, which is no surprise for a game favoring AMD, and still impressive to be fair. Not sure why the 2070 in the video performs so poorly, but I'm not going to speculate.
I remember you from the Horizon Zero Dawn thread. There your PC also absolutely crushed NXG's. Really weird.
 

SlimySnake

Flashless at the Golden Globes
i don't think so, console cpus work more efficiently [...]

practically, if you pair a ryzen 2700x with an amd gpu instead of an nvidia gpu, you magically get 20-25% more CPU-bound frames due to nvidia driver overhead [...]
hmm. very interesting. I never knew about this. Those benchmarks are nuts.

That said, this would be a great CPU for testing AMD cards like the 6600 XT and the 6700 XT, the two RDNA 2.0 cards that are the closest to the PS5 and XSX. The 2700 performs similarly to the PS5 CPU variant that went on the market recently, the 4800U I believe. So you can use this CPU to do some very interesting comparisons that will either confirm or bust the myth of console coding to the metal. If the cards perform identically when downclocked to match the PS5 and XSX tflops, we can say there is no secret sauce in the PS5's low-level API. This will also help point out any advantages the PS5 I/O might have over the PC and the XSX.

I am surprised that no one seems to be interested in doing those comparisons. Alex is happy jerking off to 2060 Super comparisons to prove his point, but I think there are far more interesting findings to be had if they stick with these two RDNA 2.0 cards.
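The downclock math itself is simple (a sketch using spec-sheet values; it only equalizes compute, not bandwidth or cache):

# RDNA2 FP32 TFLOPS = CUs * 64 shaders * 2 ops/clock * clock
def tflops(cus, mhz):
    return cus * 64 * 2 * mhz / 1e6

ps5 = tflops(36, 2230)                 # ~10.28 TF at the PS5's max clock
# Clock a 40-CU 6700 XT would have to hold to match the PS5's compute:
match_mhz = ps5 * 1e6 / (40 * 64 * 2)
print(f"PS5 ~{ps5:.2f} TF -> a 40-CU 6700 XT matches it at ~{match_mhz:.0f} MHz")
# Memory still differs: the 6700 XT has 384 GB/s plus Infinity Cache,
# while the PS5's 448 GB/s is shared with the CPU, so it's not exact.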
 

NXGamer

Member
i don't think so, console cpus work more efficiently [...]

in other words, even if we accept the "2700x" as a substitute for the ps5 cpu, you still have to use a cpu that is at least 25% faster than a 2700x if you're going to compare an nvidia gpu with the consoles [...]
All good info, and it backs up what I am covering in this video and previous ones. The thing to remember here, though, is that all these tests are running north of 60-70fps, and some of the gains are from 96fps up into the 120s, so the gap would be smaller at the 30 or 60fps targets that almost all console games are capped at. The CPU driver cost is not as visible or prominent at lower fps; the more you push the thread/IPC, the more the overhead impacts final performance.

As you say, console CPUs are not multi-purpose and do not need such complexity, or (until now) to target higher than 60fps unless for VR.
 

yamaci17

Member
maybe it's a general ampere bottleneck, people; we may not know

it is clear that the rtx 3000 series brought huge numbers, like 10k cores and such. but not every core is created equal, never forget that :)

in the end, sadly, from the gtx 1060 to the rtx 3090, nvidia has a problem: frequency. all their gpus run at around 1.9 ghz. at least amd made some strides there. games always like frequency.

maybe we're seeing an internal ampere bottleneck here that somehow causes the 3070 to not shine or something. but that's an nvidia problem; in the end, the ps5 runs at 2.2-2.3 ghz and rdna2 cards are flying high above 2.5 ghz+

maybe in that particular transition scene, frequency becomes the bottleneck. as i've said in my other post, not every 99% gpu util. is created equal. you can see 99% gpu util. in ac:valhalla, but the gpu may be consuming 160w. what does 160w mean for a 220w tdp gpu? it means some parts of the gpu are internally bottlenecked and not working at all, yet the core load may still present itself as 99%. so in this particular case, maybe his 2070 super gets higher "computational" usage and the 3070 is kinda underused. but as i've said, it's a problem with the ampere architecture. i noticed this when i did a comparative benchmark against my friend's gtx 1080 ti in nba 2k19: practically, the 3070 failed to scale beyond 1080 ti performance in certain old games

in this video, this valuable person talks about something being awkward on ampere and claims that nvidia's software team can tackle the issue with software optimizations. then again, stuff like this makes me uneasy; if a gpu's performance depends on the game developer and nvidia, it's not good. it may even mean that ampere ends up like kepler: alone, forgotten and left behind... :( this is why GCN cards aged better, because AMD kept the same architecture over the years.

nvidia however constantly shifts architectures, and i'm pretty sure that causes trouble for game development
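the "not every core is created equal" point shows up directly in the paper math (a sketch; clocks are rated boost values, and the real-world gap is my rough figure from typical reviews, not a measurement):

# Ampere SMs can dual-issue FP32, which doubles the "CUDA core" count on
# paper, but games rarely extract all of that extra throughput.
def tflops(shaders, ghz):
    return shaders * 2 * ghz / 1000

gtx_1080ti = tflops(3584, 1.58)   # ~11.3 TF (Pascal)
rtx_3070 = tflops(5888, 1.73)     # ~20.4 TF (Ampere, post-doubling count)
print(f"paper ratio, 3070 vs 1080 Ti: {rtx_3070 / gtx_1080ti:.2f}x")  # ~1.80x
# In practice the 3070 tends to land closer to ~1.3-1.5x a 1080 Ti in games,
# which is exactly the kind of underutilization described above.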
 

yamaci17

Member
All good info, and it backs up what I am covering in this video and previous ones. The thing to remember here, though, is that all these tests are running north of 60-70fps, and some of the gains are from 96fps up into the 120s, so the gap would be smaller at the 30 or 60fps targets that almost all console games are capped at. The CPU driver cost is not as visible or prominent at lower fps; the more you push the thread/IPC, the more the overhead impacts final performance.

As you say, console CPUs are not multi-purpose and do not need such complexity, or (until now) to target higher than 60fps unless for VR.
in this case, i don't refute you; it is clear that this scene is heavily GPU-bound. even a ryzen 5950x clocked at 5 ghz with 4000 mhz ram in tow cannot produce more than 49-50 fps in that scene. it's simple: the GPU is choking and pushes an avg. framerate equal to the PS5's in that particular scene and situation :)
 

Md Ray

Member
What's your average frame rate now?
Md Ray

exact same 49 fps drop with my 3070. i think it's high time we send our cards to RMA :)

C799jkt.jpg
:messenger_tears_of_joy: good one :messenger_ok:

I re-ran/benchmarked the beginning cutscene, now with an uncapped frame rate. Those dips into 49-55fps are once again happening the same way as before, in the exact same spots (your card shows the same). But the avg. fps is now around 14% higher than the PS5's, owing to the unlocked fps.

This time I might upload the vid to YT.

PS5: 49fps
Avg. FPS: 56.91


txuEo3p.png


Here's the OG benchmark with a cap:

D3D12: 50fps
Avg. FPS: 59fps


ngp5qRL.png


Here's the uncapped one. This time I also added the 1% low min fps data:

D3D12: 50fps
Avg. FPS: 65fps
1% Low: 46fps


JaTQ2jN.png


Now on to the Higgs cutscene, this might take some time though.
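The ~14% figure checks out against the PS5 average posted above (a quick arithmetic check, using the numbers as posted):

ps5_avg = 56.91    # PS5 average for this cutscene, as posted above
capped = 59.0      # 3070 average with the 60fps cap
uncapped = 65.0    # 3070 average once the cap is removed

print(f"capped:   {capped / ps5_avg - 1:+.0%} vs PS5")    # ~ +4%
print(f"uncapped: {uncapped / ps5_avg - 1:+.0%} vs PS5")  # ~ +14%
# The cap was hiding most of the 3070's headroom.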
 
Keep that save file. And send it to Werewolfgrandma and Low Moral Fibre so they can run the same test. That scene drops below 60fps a lot on the PS5. It should be the perfect test for cards like the 3060 Ti, which is more powerful than the 2080 Super.
If remote install works this time, it will be ready when I get home. If not, I'll have to download it then. 1.5Gb internet, so it shouldn't take too long.

Edit
I actually do want to play the game. I always play open-world games with as little HUD as possible, and this game seems built for that. So if it has major spoilers, I'll just play up to that point tonight if it's not too far.
 

jroc74

Phone reception is more important to me than human rights
In the Deathloop thread, wasn't he saying it was a whole-system comparison to validate his conclusions? And now he has gone the other way here. Why the change?
He specifically addressed what the follow-up video was for.
 

SlimySnake

Flashless at the Golden Globes
If remote install works this time, it will be ready when I get home. If not, I'll have to download it then. 1.5Gb internet, so it shouldn't take too long.

Edit
I actually do want to play the game. I always play open-world games with as little HUD as possible, and this game seems built for that. So if it has major spoilers, I'll just play up to that point tonight if it's not too far.
Actually, I'd highly recommend that you just burn through Chapter 2. You don't have to use his save file, but please get out of that Chapter 2 area ASAP. Everyone gets stuck there doing side missions and then loses interest. The game does not start until Chapter 3.

I spent 15 hours in Chapter 2. I am pretty sure I could complete it in 2 hours now. It might take you 4-5 since you haven't played it, but the first few hours of the game are punishing as fuck on purpose. So keep that in mind. It gets much better.
 

TrebleShot

Member
This is an interesting comparison for pure tech buffs, but it doesn't really make too much sense.

I'd liken it to comparing a Tesla to a Ferrari: two very different cars with different architectures, ultimately doing the same thing.

Having a high-end gaming PC does not mean you have the ultimate gaming device. There are things the PC can do with raw power that the PS5 can only dream of (high res with high frame rates), but other things that aren't taken into consideration tip the balance the other way, such as accessibility, quality of content, and a not-so-huge delta in performance to mitigate the price.

Not to mention it's just nice to play on a console, and specifically the PS5; it just feels different.

Another comparison I believe to be similar is an iPhone vs the latest top-of-the-range Android.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
my contribution:
i7 7700 + 1080 Ti at 4K max settings: the intro is 60fps 80% of the time
there are dips to 50-53 and 57
and the specific moment is 43fps...
That's actually really impressive for a 1080 Ti.
The card that keeps on giving.

Did you manage to get the average by the end of the cutscene?
 

Dream-Knife

Banned
hmm. very interesting. I never knew about this. Those benchmarks are nuts.

That said, this would be a great CPU for testing AMD cards like the 6600 XT and the 6700 XT [...] If the cards perform identically when downclocked to match the PS5 and XSX tflops, we can say there is no secret sauce in the PS5's low-level API. [...]
I would also like to see a 6800 or 6800 XT with its clocks turned down, to test the frequency-vs-CU theory.
maybe it's a general ampere bottleneck, people; we may not know [...]

maybe we're seeing an internal ampere bottleneck here that somehow causes the 3070 to not shine or something. but that's an nvidia problem; in the end, the ps5 runs at 2.2-2.3 ghz and rdna2 cards are flying high above 2.5 ghz+ [...]

Navi 21 cards all run under 2300MHz from the factory. Partner cards can be OC'd higher, but never that high.
 

SlimySnake

Flashless at the Golden Globes
:messenger_tears_of_joy: good one :messenger_ok:

I re-ran/benchmarked the beginning cutscene, now with an uncapped frame rate. [...] The avg. fps is now around 14% higher than the PS5's, owing to the unlocked fps. [...]

Here's the uncapped one. This time I also added the 1% low min fps data:

D3D12: 50fps
Avg. FPS: 65fps
1% Low: 46fps

[...]

Now on to the Higgs cutscene, this might take some time though.
You know, something I have brought up time and time again is how much I dislike these 1% low benchmarks Alex does, where he finds the worst possible point in a game and bases all of his benchmarks on it. You can see from your average that the actual framerate is much higher: 46 vs 50 at the 1% low, compared to 56 vs 65 on average.

If you go by the 1% minimum, the 3070 is 8% faster than the PS5. If you go by the average, the 3070 is 16% faster.

What's curious and kind of funny is that Alex did make an exception when it came to Control, and decided to take an average of around 20 or so different areas in the game. Why? Because his 1% minimum location in that game, the infamous corridor of doom with tons of reflections, showed just a 1fps gap between the PS5 and XSX. Well, we can't have that, so he went and got benchmarks at other locations in the game. How convenient.

Hy7sn8m.jpg
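The two comparisons quoted above, worked through (a quick check using the figures as posted; the 8% presumably rounds the same ratio down):

# Same run, two very different headlines depending on the metric.
low_3070, low_ps5 = 50, 46   # 1% low / minimum figures as posted
avg_3070, avg_ps5 = 65, 56   # averages as posted

print(f"by 1% low:  3070 is {low_3070 / low_ps5 - 1:+.0%}")   # ~ +9%
print(f"by average: 3070 is {avg_3070 / avg_ps5 - 1:+.0%}")   # ~ +16%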
 