
Death Stranding - In-depth PS5 vs Nvidia & AMD PC - Performance Deep Dive

Jack Uzi

Banned
This analysis feels like a jab at Mr. Alexander Battaglia and his long-running quest to make the console GPUs out to be equivalent to an RTX 2060S.

Anyway good thorough analysis.
He does it more often than you realise. Another example was after the Halo video Battleborg made about ray tracing. NX was like "ray tracing won't solve it" ~ not his exact words, but something to that effect.
 

Zathalus

Member
Low-level API (GNM) instead of the DirectX-compatible (but slower) API (GNMX) that is very probably used in many multi-plat games.
Multiplatform games have been using GNM since 2013. GNMX is similar to DX11 so I doubt many developers use it. For that matter, I doubt the PS5 still supports the use of GNMX for native games. Microsoft dropped support for DX11 for the Series consoles for example.
 

Stooky

Member
Striking distance?
If it's failing to hold 60, how is an extra 20-30 frame advantage "striking distance"?

That's the same as me saying a game that runs at 30fps is within striking distance of running at 60fps.

No, it's not.

3070s regularly average in the 80s... averaging under 60fps is a completely different class of card.
Like, that's a 2080-class card... what we have always compared the PS5 to.

Why he chose a card that very clearly outclasses the PS5 is beyond me.
Because 60fps and 80+fps aren't close.
If the game holds 60fps most of the time, it's most likely hitting above 60 in some scenarios. We'll never know because it's locked. I wouldn't be surprised if Death Stranding on PS5 averages around 70fps depending on settings.

EDIT: It depends on studio philosophy. I've worked in studios that believed that for 60fps to go on the box, the game had to hold 60fps the majority of the time. For us the game was hitting well above 60fps, averaging in the 70s. We didn't unlock it because of frame-pacing issues with TVs.
 
Last edited:
Multiplatform games have been using GNM since 2013. GNMX is similar to DX11 so I doubt many developers use it. For that matter, I doubt the PS5 still supports the use of GNMX for native games. Microsoft dropped support for DX11 for the Series consoles for example.
GNMX is the DirectX-like API available on all PlayStations to ease ports from PC. It's not "DirectX 11".
 

Md Ray

Member
Nope. You just can't say something is GPU-bound, or call something a GPU stress test, without testing both GPUs with the same set of CPUs. Anyone can just wave their hands and say something is GPU-bound. That's not how any of this works.
Like you're doing now...

At least I have data to back up my claims.

Here's 2700X vs top-tier CPUs w/ RX 6900 XT in Death Stranding (benchmark by: HUB/TechSpot)

Like I've been saying, Zen+ Ryzen is plenty capable of running this game at well over 60fps; look at the 1080p result: it's averaging 150+fps. Now look at the 4K results.

DS.png

"If he had paired it with the 3600 CPU, the 2070 will perform better."
Sure...
 

Zathalus

Member
GNMX is the DirectX-like API available on all PlayStations to ease ports from PC. It's not "DirectX 11".
I said it's similar, not that it is the same. GNMX is roughly equivalent in functionality to DX11, while DX12 was designed to be very similar to GNM in terms of low-level access to the GPU hardware.
 
Like you're doing now...

At least I have data to back up my claims.

Here's 2700X vs top-tier CPUs w/ RX 6900 XT in Death Stranding (benchmark by: HUB/TechSpot)

Like I've been saying, Zen+ Ryzen is plenty capable of running this game at well over 60fps; look at the 1080p result: it's averaging 150+fps. Now look at the 4K results.

DS.png


Sure...
Quit bickering and work on that 3070 report please

This is a joke
 

winjer

Gold Member
This is a terrible test.
He uses two GPUs with different architectures from the PS5.
One is an RTX 2070, with lower shader throughput and even lower pixel and texture fill rates than the PS5.
The other is an RDNA1 card with a narrower memory bus and even lower specs.
And it's a game that runs significantly better on AMD GPUs; in this game an RX 5700 XT is as fast as an RTX 2080.
He could at least have gotten an RX 6700 XT, underclocked it a bit, and matched the PS5 GPU's specs more closely.
 
The results aren't super surprising. The general trend we've seen is that the PS5 can match a 2070S in raster, or a 2060S with RT on. The average over the scene tested came to 56.85fps on the PS5, and NXG estimates a 10% penalty due to the hard 60fps vsync cap. Adjusting for that vsync penalty, we get an average of just over 60fps. You know what else averages just over 60fps at 4K? A 2070S, of course (granted, this is at Very High quality, not PS5-equivalent settings). Other trends are also visible: the 2070 is on average 13% faster than the 5600 XT (same as NXG's results without vsync, and without vsync is how GPUs are tested), and the 2070S is 20% faster than the base 2070 (vs. 25% in the NXG tested scene; the minor difference is probably down to the specifics of the scene tested).

YsKKIFE.png
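The vsync adjustment described above is just this arithmetic (a sketch; the 10% cap penalty is NXG's estimate, not a measured number):

```python
# Approximate the unlocked average by removing the estimated cost of the
# 60fps vsync cap: capped_avg / (1 - penalty).
capped_avg_fps = 56.85   # PS5 average over the tested scene (from the post)
vsync_penalty = 0.10     # NXG's estimated throughput cost of the 60fps cap

unlocked_avg_fps = capped_avg_fps / (1 - vsync_penalty)
print(f"Adjusted PS5 average: {unlocked_avg_fps:.1f} fps")  # ~63.2 fps
```

which is how the "just over 60fps" figure falls out.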
 

NXGamer

Member
Can I say, I am really happy to see this kind of discussion here; it's exactly what I intend with most of my videos: people asking questions, fact-checking, testing themselves, and looking beyond the basics that most concentrate on.

Excellent to see.

Also, I have an Aperitif video uploading now that complements this with some more info and validation tests, which may also add more meat on the bones.

Thanks
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
If the game holds 60fps most of the time, it's most likely hitting above 60 in some scenarios. We'll never know because it's locked. I wouldn't be surprised if Death Stranding on PS5 averages around 70fps depending on settings.

EDIT: It depends on studio philosophy. I've worked in studios that believed that for 60fps to go on the box, the game had to hold 60fps the majority of the time. For us the game was hitting well above 60fps, averaging in the 70s. We didn't unlock it because of frame-pacing issues with TVs.
I'm not disputing overhead; good game development means ensuring you have some overhead for extenuating factors... but you don't realistically leave 30fps on the table... why would you?
But 3070s will average ~80fps.
In scenes where the PS5 isn't averaging 60fps, do you really think removing the frame lock will suddenly get it all the way to 80+fps?
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Can I say, I am really happy to see this kind of discussion here, exactly what I intend from most of my videos. People asking questions, fact checking, testing themselves and looking beyond the basics that most concentrate on.

Excellent to see.

Also, I have an Aperitif video uploading now that complements this with some more info and validation tests, which may also add more meat on the bones.

Thanks
star-wars-the-mandalorian.gif


P.S. Quite a number of console warriors are still banned and/or have learnt their lessons.
 
Can I say, I am really happy to see this kind of discussion here, exactly what I intend from most of my videos. People asking questions, fact checking, testing themselves and looking beyond the basics that most concentrate on.

Excellent to see.

Also, I have an Aperitif video uploading now that complements this with some more info and validation tests, which may also add more meat on the bones.

Thanks
I like my comparison videos like I like my women: a little more meat on the bones.
 

Dream-Knife

Banned
We know for a fact that a 3070 is more powerful than a PS5; there's no debate. So either the testing methodology is flawed, or something is going wrong with the game.

This is a terrible test.
He uses two GPUs with different architectures from the PS5.
One is an RTX 2070, with lower shader throughput and even lower pixel and texture fill rates than the PS5.
The other is an RDNA1 card with a narrower memory bus and even lower specs.
And it's a game that runs significantly better on AMD GPUs; in this game an RX 5700 XT is as fast as an RTX 2080.
He could at least have gotten an RX 6700 XT, underclocked it a bit, and matched the PS5 GPU's specs more closely.
Isn't the PS5 like an underclocked 6600 XT? FP32-wise, the 6600 XT is slightly above a PS5.
 

ClosBSAS

Member
No one seems to be talking about load times:

It's not an apples-to-apples comparison, as in, the PC is using a 3.4 GB/s SSD vs 5.5 GB/s on the PS5, but it still highlights that despite the PS5's SSD having around 1.6x higher read bandwidth, it's around 3x faster at loading. That's something DirectStorage should be able to fix once game devs take advantage of it on the PC side. Of course, the devs would have to go back and patch the game on PC to take full advantage of NVMe SSDs, just like they did for the PS5.

3yVJkB6.png
lol, it's nowhere near 12 seconds... wtf... I never got past 5 seconds of loading with an Evo 970.
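For what it's worth, the ratio argument in the quoted post is easy to reproduce (a sketch; the 3x load-time figure is the post's claim, and exact load times vary by drive and scene):

```python
# Raw SSD bandwidth ratio vs the observed loading-speed ratio. The gap
# between the two is roughly the PC-side I/O overhead that something
# like DirectStorage would need to close.
ps5_bw_gbs, pc_bw_gbs = 5.5, 3.4        # sequential read, GB/s (from the post)
bw_ratio = ps5_bw_gbs / pc_bw_gbs       # ~1.62x raw bandwidth advantage
load_ratio = 3.0                        # loading-speed advantage claimed in the post

overhead_factor = load_ratio / bw_ratio # ~1.85x not explained by raw bandwidth
print(f"bandwidth {bw_ratio:.2f}x, loading {load_ratio:.1f}x, "
      f"implied PC-side overhead ~{overhead_factor:.2f}x")
```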
 

winjer

Gold Member
We know for a fact that a 3070 is more powerful than a PS5; there's no debate. So either the testing methodology is flawed, or something is going wrong with the game.


Isn't the PS5 like an underclocked 6600 XT? FP32-wise, the 6600 XT is slightly above a PS5.

In terms of CU count, there isn't a direct equivalent.
The PS5 has 36 CUs, the 6600 XT has 30, and the 6700 XT has 40.
All have 64 ROPs, although they are different.
The 6700 XT has more TMUs, but it would be clocked lower, so it would better match the PS5.

There are many more differences, so finding an exact match is impossible.
But an underclocked 6700 XT would be a much better match than a 2070 or a 5600 XT.
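For anyone wanting to sanity-check the paper specs being thrown around here, FP32 throughput for these RDNA parts is just CUs × 64 lanes × 2 FLOPs per clock × clock. A quick sketch (the clock figures are the commonly cited boost/peak clocks, my assumption rather than anything from the post):

```python
# FP32 TFLOPS for an RDNA GPU: each CU has 64 shader lanes, each capable
# of 2 FLOPs per clock via fused multiply-add.
def tflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000

print(f"PS5     (36 CU @ 2.23 GHz):  {tflops(36, 2.23):.2f} TF")   # ~10.28
print(f"XSX     (52 CU @ 1.825 GHz): {tflops(52, 1.825):.2f} TF")  # ~12.15
print(f"6600 XT (32 CU @ 2.589 GHz): {tflops(32, 2.589):.2f} TF")  # ~10.60
```

The last line matches the 10.6 TFLOPS figure quoted for the 6600 XT later in the thread.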
 

Dream-Knife

Banned
In terms of CU count, there isn't a direct equivalent.
The PS5 has 36 CUs, the 6600 XT has 30, and the 6700 XT has 40.
All have 64 ROPs, although they are different.
The 6700 XT has more TMUs, but it would be clocked lower, so it would better match the PS5.

There are many more differences, so finding an exact match is impossible.
But an underclocked 6700 XT would be a much better match than a 2070 or a 5600 XT.
The 6600 XT has 32, but yeah. Performance is 10.6 TFLOPS.
 
Last edited:

winjer

Gold Member
The 6600 XT has 32, but yeah. Performance is 10.6 TFLOPS.

But because of that high clock speed, it would have a much higher pixel fill rate.
Also, the 6600 XT is very limited in memory bandwidth: it's about half of the PS5's, and it only has 32MB of L3 cache.
The 6600 XT also only has 2MB of L2 cache; the PS5 has 4MB, and the 6700 XT has only 3MB of L2.
The 6700 XT has a 192-bit bus, but its 96MB of L3 cache makes up for it.

As you can see, it's very difficult to match exact specs.
 

SlimySnake

Flashless at the Golden Globes
But because of that high clock speed, it would have a much higher pixel fill rate.
Also, the 6600 XT is very limited in memory bandwidth: it's about half of the PS5's, and it only has 32MB of L3 cache.
The 6600 XT also only has 2MB of L2 cache; the PS5 has 4MB, and the 6700 XT has only 3MB of L2.
The 6700 XT has a 192-bit bus, but its 96MB of L3 cache makes up for it.

As you can see, it's very difficult to match exact specs.
Agreed, but the 6600 XT and the 6700 XT are a far better test than the 2070 and 5600 XT.

I think the 6700 XT downclocked to hit 12.1 tflops is a great test to confirm whether higher clocks offer better performance compared to the 1.825 GHz XSX.

The tflops difference between a 2.23 GHz PS5 and a 2.5 GHz 6600 XT is literally 0. Yes, the 6600 XT has higher clocks, but it's bound by only 256 GB/s of RAM bandwidth, so we can rule out any advantages. If the 6600 XT does outperform the PS5, we can simply attribute it to Cerny's theory that higher clocks offer better performance.
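Solving the FP32 throughput formula (TFLOPS = CUs × 64 lanes × 2 FLOPs per clock × clock) for clock gives the rough downclock targets (a sketch; 12.1 TF is the commonly cited XSX figure and ~10.28 TF the PS5 figure):

```python
# Clock (GHz) needed for a given FP32 target on an RDNA card:
# invert TFLOPS = CUs * 64 * 2 * clock_ghz / 1000.
def clock_for_tflops(target_tf: float, cus: int) -> float:
    return target_tf * 1000 / (cus * 64 * 2)

print(f"6700 XT (40 CU) to match XSX 12.1 TF:  {clock_for_tflops(12.1, 40):.2f} GHz")   # ~2.36
print(f"6600 XT (32 CU) to match PS5 10.28 TF: {clock_for_tflops(10.28, 32):.2f} GHz")  # ~2.51
```

The second line is why a ~2.5 GHz 6600 XT is essentially a tflops wash with the PS5.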
 

winjer

Gold Member
Agreed, but the 6600 XT and the 6700 XT are a far better test than the 2070 and 5600 XT.

I think the 6700 XT downclocked to hit 12.1 tflops is a great test to confirm whether higher clocks offer better performance compared to the 1.825 GHz XSX.

The tflops difference between a 2.23 GHz PS5 and a 2.5 GHz 6600 XT is literally 0. Yes, the 6600 XT has higher clocks, but it's bound by only 256 GB/s of RAM bandwidth, so we can rule out any advantages. If the 6600 XT does outperform the PS5, we can simply attribute it to Cerny's theory that higher clocks offer better performance.

The issue is that the memory subsystem of the 6600 XT is very limited, especially if we are going to test at 4K.
HUB did a test with the 5700 XT and the 6700 XT, since they have the same CU count. The 5700 XT has a 256-bit bus; the 6700 XT has a 192-bit bus with 96MB of L3.
They underclocked the 6700 XT to the clocks of the 5700 XT, and performance was identical.
So it would seem that a 192-bit bus with 96MB of L3 cache is a good stand-in for the 256-bit bus of the PS5.
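The raw-bandwidth side of that comparison is simple to lay out (a sketch; the per-pin data rates are the commonly quoted GDDR6 speeds, 14 Gbps for the PS5 and 16 Gbps for the RDNA2 cards, and this deliberately ignores the Infinity Cache):

```python
# Raw GDDR6 bandwidth: (bus width in bits / 8) bytes per transfer,
# times the per-pin data rate in Gbps, gives GB/s.
def bandwidth_gbs(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits / 8 * gbps_per_pin

print(f"PS5     (256-bit @ 14 Gbps): {bandwidth_gbs(256, 14):.0f} GB/s")  # 448
print(f"6700 XT (192-bit @ 16 Gbps): {bandwidth_gbs(192, 16):.0f} GB/s")  # 384
print(f"6600 XT (128-bit @ 16 Gbps): {bandwidth_gbs(128, 16):.0f} GB/s")  # 256
```

The 96MB L3 is what's bridging the 384 vs 448 GB/s gap in the HUB result above.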
 
I'm convinced these PS5 v PC videos are just an elaborate troll at this stage.

Some of the comments that people are obviously going to bandy about afterwards (e.g. PS5 performing closer to 3070 level) are wild.

This is HUB so you know there is no DLSS and the 1% lows on a 3070 are 76.

8DU6CKa.png


And here is PS5 dropping to 47 fps.

LKuHGS2.png


There are loads of cards in between the 2070 and 3070 that the PS5 is "performing more in line with".
 

Stooky

Member
I'm not disputing overhead; good game development means ensuring you have some overhead for extenuating factors... but you don't realistically leave 30fps on the table... why would you?
But 3070s will average ~80fps.
In scenes where the PS5 isn't averaging 60fps, do you really think removing the frame lock will suddenly get it all the way to 80+fps?
They can't guarantee a stable 80fps, which is why it's not happening; I'm sure the PS5 is not hitting that consistently. They would have to run it unlocked and deal with all the fps fluctuations, and people would complain. If it's holding 60fps most of the time, then I'm sure it's averaging 70-ish fps most of the time, possibly getting close to 80 in some spaces. On 30fps-locked titles I've worked on, I've seen dips to 26fps and peaks up to 45-50fps. During development, if we had that much frame headroom we would add more graphics features until the framerate averaged out a little above the fps lock. Development on ports is a little different, since features are generally already set; on Death Stranding I think they allocated the extra headroom to the different aspect ratios they offer, or maybe the fps just fluctuated too much to make a higher frame rate a feature. Who knows?
 
Last edited:

SlimySnake

Flashless at the Golden Globes
I used a 5700 XT benchmark of the same scene to compare the PS5 GPU with the 5700 XT, and it looks like the PS5 is consistently 5-7 fps better than this overclocked 10.75-tflops 5700 XT.

i8iWCaI.jpg


Pretty damn impressive, I'd say, considering there were no IPC gains between RDNA 1.0 and RDNA 2.0, and the fact that the 5700 XT has a massive bandwidth advantage (448 GB/s all to itself, whereas the PS5 GPU has to share its bandwidth with the CPU). Also, this benchmark is using a 3950X, which is a $750 16-core/32-thread CPU. So we can chalk this up entirely to the PS5 I/O and console APIs.

Timestamped: Run them side by side to see a consistent 5-7 fps advantage.





Based on this, I'd say the PS5 is performing somewhere between a 2080 and a 2080 Super in this game. This game heavily favors AMD cards, seeing as the 7.9-tflops 5700 is outperforming the 9.3-tflops 2070. So it's more in line with games like AC Valhalla, which was also in the 2080 Super range.

index.php
 

SlimySnake

Flashless at the Golden Globes
I'm convinced these PS5 v PC videos are just an elaborate troll at this stage.

Some of the comments that people are obviously going to bandy about afterwards (e.g. PS5 performing closer to 3070 level) are wild.

This is HUB so you know there is no DLSS and the 1% lows on a 3070 are 76.

8DU6CKa.png


And here is PS5 dropping to 47 fps.

LKuHGS2.png


There are loads of cards in between the 2070 and 3070 that the PS5 is "performing more in line with".
Yeah, it's more like a 2080 or a 2080 Super than a 3070, which is basically a 2080 Ti. The 2080 Ti hits 100 fps in this game when walking around. The PS5 is amazing, but not that amazing.

I think NX Gamer has the right idea. I just saw the DF video, and they don't even bother comparing the PC cards. I've been begging these guys to grab a 6600 XT and a 6700 XT and do some comparisons between the PS5, the XSX and those PC cards. It is a far better comparison than any 5700 XT or 20-series graphics card comparison. So it's great to see this video, but the comparison parameters are all wrong. He's using the wrong CPU in the 5600 XT comparison. He doesn't have the right cards for the comparison. Like, why even test the 7-tflops 5600 XT? What are you trying to compare? Especially when you have a 9.7-tflops 5700 XT you can easily overclock to 10.2 tflops, and a 6600 XT you can downclock to 10.2 tflops.
 

Dream-Knife

Banned
I used a 5700 XT benchmark of the same scene to compare the PS5 GPU with the 5700 XT, and it looks like the PS5 is consistently 5-7 fps better than this overclocked 10.75-tflops 5700 XT.

i8iWCaI.jpg


Pretty damn impressive, I'd say, considering there were no IPC gains between RDNA 1.0 and RDNA 2.0, and the fact that the 5700 XT has a massive bandwidth advantage (448 GB/s all to itself, whereas the PS5 GPU has to share its bandwidth with the CPU). Also, this benchmark is using a 3950X, which is a $750 16-core/32-thread CPU. So we can chalk this up entirely to the PS5 I/O and console APIs.

Timestamped: Run them side by side to see a consistent 5-7 fps advantage.





Based on this, I'd say the PS5 is performing somewhere between a 2080 and a 2080 Super in this game. This game heavily favors AMD cards, seeing as the 7.9-tflops 5700 is outperforming the 9.3-tflops 2070. So it's more in line with games like AC Valhalla, which was also in the 2080 Super range.

index.php

Do we know what setting presets the PS5 version is using?
 

SlimySnake

Flashless at the Golden Globes
Do we know what setting presets the PS5 version is using?
That's a good question. I see that NX Gamer is still using the default settings, which were basically the PS4 Pro settings at the time, but I thought DF said the PS5 is using higher LODs and draw distance compared to the PS4 versions. I could be mistaken though.
 

Md Ray

Member
Alright, so I gathered some quick-and-dirty results from my 3070 running at native 4K, no DLSS (same settings as PS5):

We initially see 3070 being 12% ahead on avg.

EYXW9Z6.jpg

9MuErqq.png



ycHuTO7.jpg
a3T1rFG.jpg


Here we see the biggest in-the-moment framerate difference between them (40fps vs 51fps):

xQWGs6a.png
hOschMM.png


But then things kind of start leveling out from here on...

PS5 is performing similarly to my 3070 rig:

fEvYrOc.png
Y8haQlY.png



hx8OXaS.png
tBr2RDz.png


And the avg. fps difference ends up being 4% between 3070 and PS5 GPU...

txuEo3p.png
L6MgvG5.png
 

Md Ray

Member
That's a good question. I see that NX Gamer is still using the default settings, which were basically the PS4 Pro settings at the time, but I thought DF said the PS5 is using higher LODs and draw distance compared to the PS4 versions. I could be mistaken though.
The PS5 is now basically using the PC's max settings (including model detail, aka LOD/draw distance) and has improved water quality on top, not found on PC. The water quality on PC is the same as on the PS4 versions.
 
Alright, so I gathered some quick-and-dirty results from my 3070 running at native 4K, no DLSS (same settings as PS5):

We initially see 3070 being 12% ahead on avg.

EYXW9Z6.jpg

9MuErqq.png



ycHuTO7.jpg
a3T1rFG.jpg


Here we see the biggest in-the-moment framerate difference between them (40fps vs 51fps):

xQWGs6a.png
hOschMM.png


But then things kind of start leveling out from here on...

PS5 is performing similarly to my 3070 rig:

fEvYrOc.png
Y8haQlY.png



hx8OXaS.png
tBr2RDz.png


And the avg. fps difference ends up being 4% between 3070 and PS5 GPU...

txuEo3p.png
L6MgvG5.png
What are the PS5 settings?
Edit
I see you answered already. Why do I keep reading that PC has better draw distance?
 
Last edited:

Dream-Knife

Banned
That's a good question. I see that NX Gamer is still using the default settings, which were basically the PS4 Pro settings at the time, but I thought DF said the PS5 is using higher LODs and draw distance compared to the PS4 versions. I could be mistaken though.
Also, didn't the PS4 version have dynamic resolution?
Alright, so I gathered some quick-and-dirty results from my 3070 running at native 4K, no DLSS (same settings as PS5):

We initially see 3070 being 12% ahead on avg.

EYXW9Z6.jpg

9MuErqq.png



ycHuTO7.jpg
a3T1rFG.jpg


Here we see the biggest in-the-moment framerate difference between them (40fps vs 51fps):

xQWGs6a.png
hOschMM.png


But then things kind of start leveling out from here on...

PS5 is performing similarly to my 3070 rig:

fEvYrOc.png
Y8haQlY.png



hx8OXaS.png
tBr2RDz.png


And the avg. fps difference ends up being 4% between 3070 and PS5 GPU...

txuEo3p.png
L6MgvG5.png
Is it 3070 or 2070 as shown in your pictures?
 

Md Ray

Member
What are the PS5 settings?
Edit
I see you answered already. Why do I keep reading that PC has better draw distance?
Because PC offered better draw distance over PS4 Pro.

I also tested with PS4 Pro level (default) draw distance just for good measure, no difference in perf. It was nigh on identical to the max setting.
 
Because PC offered better draw distance over PS4 Pro.

I also tested with PS4 Pro level (default) draw distance just for good measure, no difference in perf. It was nigh on identical to the max setting.
Sorry, I meant LOD. Even the DF article about this PS5 release mentioned higher LOD options.
 

Md Ray

Member
I was quite spot on when I said this:
I do think PS5 would be close to 80fps avg. with Vsync off during gameplay, and I also think the 3070 will have dips under 60fps at native 4K in certain GPU-heavy sections like those cut-scenes, maybe not to the same extent as PS5 though.

Check this out Black_Stride
Alright, so I gathered some quick-and-dirty results from my 3070 running at native 4K, no DLSS (same settings as PS5):

We initially see 3070 being 12% ahead on avg.

EYXW9Z6.jpg

9MuErqq.png



ycHuTO7.jpg
a3T1rFG.jpg


Here we see the biggest in-the-moment framerate difference between them (40fps vs 51fps):

xQWGs6a.png
hOschMM.png


But then things kind of start leveling out from here on...

PS5 is performing similarly to my 3070 rig:

fEvYrOc.png
Y8haQlY.png



hx8OXaS.png
tBr2RDz.png


And the avg. fps difference ends up being 4% between 3070 and PS5 GPU...

txuEo3p.png
L6MgvG5.png
 