
Digital Foundry: Deathloop PC vs PS5, Optimised Settings, Performance Testing + More

SlimySnake

Flashless at the Golden Globes
What benchmarks do we have showing the PS5 outperforming a 5700 XT?


So at 1.98 GHz vs the 2.64 GHz in the TechPowerUp review, we're talking about 10.14 TF vs 10.81 TF. That's a 6.6% advantage for the 6600 XT. But I don't think we can just assume the 5700 XT being used in the review was averaging those clockspeeds. How do we know they weren't using the reference 5700 XT from their original review, which averaged 1.88 GHz?
nah. Most in-game benchmarks I've seen have the 5700 XT running at around 1.95-2.0 GHz during gameplay. It's the same as my RTX 2080 and pretty much every Nvidia card starting from Pascal. They run well over even the advertised boost clocks.

[screenshots: in-game GPU clock speed readings]
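
As a rough sanity check, RDNA FP32 TFLOPS is just CUs × 64 shaders × 2 ops per clock × clock speed. A quick sketch with the clocks quoted above (observed in-game clocks, not spec-sheet boost figures):

Code:
# FP32 TFLOPS for RDNA GPUs: CUs x 64 shaders x 2 FP32 ops/clock x clock (GHz)
def tflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000  # GFLOPS -> TFLOPS

print(f"5700 XT @ 1.98 GHz: {tflops(40, 1.98):.2f} TF")  # ~10.14 TF
print(f"6600 XT @ 2.64 GHz: {tflops(32, 2.64):.2f} TF")  # ~10.81 TF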
 

Md Ray

Member
since he said higher clocks + narrow design is a huge benefit, which it isn't for GPUs, as demonstrated by simple PC tests that show basically no gain in performance when running a narrow, high-clocked GPU vs a wide, slower-clocked GPU where both are pushing the same amount of FLOP/s

I literally posted several screenshots proving this in this thread, where an RDNA1 GPU has almost identical performance to an RDNA2 GPU in the same ballpark in terms of FLOP/s. The slower-clocked RDNA1 card actually had a slight FLOP/s disadvantage on paper and still managed to basically be on par.

Cerny wants to sell you consoles, so of course he uses marketing speak to tickle fanboy balls
So you're going to ignore this? Why does the 6600 XT have up to 11% higher fps here with just a 6% increase in TF and much lower bandwidth?

"better perf myth"... It depends from game to game. Why omit CP2077 and Flight Sim results which show up to 11% higher fps on 6600 XT despite the avg. TF difference seems to be around 6% between 5700 XT (9.9 TF) and 6600 XT (10.6 TF)? Because that would go against your narrative.

11% higher avg. fps
[benchmark screenshot]


10% higher avg. fps
[benchmark screenshot]


The 6600 XT also has a massive bandwidth disadvantage compared to the 5700 XT, so games favoring more BW will see the 5700 XT pulling ahead of the 6600 XT. You didn't debunk anything here.
 
Last edited:

Sosokrates

Report me if I continue to console war
Since when is a man that designed multiple consoles and filed multiple patents a liar?

So far everything he said about the PS5 is true.
-Ray Tracing, Decompression, SSD Speed, 3D Audio, Extreme Cooling, etc.

We all know from looking at the PS3 and PS4 how games start to look when they take full advantage of the consoles, and PS5 is no different.

Also, the PS5 is a budget device; comparing it to GPUs that cost much more than the console is just plain ridiculous, especially when you take the SSD cost into account.

The PS5's potential is the sum of its parts, not just the GPU. Developers not optimizing their games to take advantage of the PS5's features is why we have these stupid discussions.

Just like the PS4 had TLOU 2, the PS5 will punch above its weight and produce games in similar fashion.

Technically there is nothing special about TLOU2. It has similar polycounts and texture quality to other gen-8 games, it uses screen-space reflections and cube maps, and its global illumination is baked most of the time.
The reason TLOU2 looks so good is that the talented artists at Naughty Dog have spent a lot of time crafting the environments by hand.
Cerny gave an example that fewer CUs and higher clocks would produce significant performance improvements over more CUs and a lower clockspeed.
I wish he would have shown a demo of this, because people have done this experiment with PC GPUs and the higher-clocked parts perform the same.
We're also not seeing the PS5 perform significantly better than a PC equipped with a 5700 XT (without ray tracing), and the PS5's performance compared to the Series X is what you would expect for a GPU with about 18% fewer TFLOPS.

So why would someone believe what Cerny said regarding clockspeeds when real-world examples are not showing this?
 

SirVoltus

Member
On PS5 the game is locked at 60fps 99% of the time (mostly above 1400p, according to both El Analista and VGTech). Usually you need an average quite a bit higher than 60fps in order to have such consistency (maybe ~70fps). You need to look at the 1% minimum framerate, not the average, to compare using PC benchmarks.
To some degree, yes. But when El Analista mentions 60 fps for the PS5, isn't that also not a consistent 60 fps?

[framerate analysis screenshot]
 

SlimySnake

Flashless at the Golden Globes
Since when is a man that designed multiple consoles and filed multiple patents a liar?

So far everything he said about the PS5 is true.
-Ray Tracing, Decompression, SSD Speed, 3D Audio, Extreme Cooling, etc.

We all know from looking at the PS3 and PS4 how games start to look when they take full advantage of the consoles, and PS5 is no different.

Also, the PS5 is a budget device; comparing it to GPUs that cost much more than the console is just plain ridiculous, especially when you take the SSD cost into account.

The PS5's potential is the sum of its parts, not just the GPU. Developers not optimizing their games to take advantage of the PS5's features is why we have these stupid discussions.

Just like the PS4 had TLOU 2, the PS5 will punch above its weight and produce games in similar fashion.
I am OK with comparing consoles to PCs for academic purposes. Where I draw the line is DF doing sponsored Nvidia trash like the laptop with a 3060 Ti. That thing is 1,500 pounds, or something crazy like $2,250. Why is it even a comparison? I get that Nvidia asked them to do a sponsored video, but at some point you have to say no. They are not a random YouTube channel anymore. They need to have some kind of standards.

The PS5 and XSX are around the RTX 2060 Super when it comes to ray tracing, according to Alex's own tests. That's fine; it's more or less in line with the RDNA 2.0 cards on PC. What's interesting is that in some games like Doom and Control, they are on par with the 2070 Super when they use checkerboarded reflections on consoles. And that kind of optimization/downgrade is typical in console games, which use techniques like lower framerates for distant animations, lower-quality alpha effects like fire rendering, etc.

Alex sometimes gets too wrapped up in this stuff, but I think what he's trying to do here is give PC users a more optimized experience than you typically get on PC. A lot of the settings on PC just don't offer a visible upgrade and cost a lot of GPU power. By using PS5 settings, you get the best visuals you can without sacrificing framerate.
 

Loxus

Member
since he said higher clocks + narrow design is a huge benefit, which it isn't for GPUs, as demonstrated by simple PC tests that show basically no gain in performance when running a narrow, high-clocked GPU vs a wide, slower-clocked GPU where both are pushing the same amount of FLOP/s

I literally posted several screenshots proving this in this thread, where an RDNA1 GPU has almost identical performance to an RDNA2 GPU in the same ballpark in terms of FLOP/s. The slower-clocked RDNA1 card actually had a slight FLOP/s disadvantage on paper and still managed to basically be on par.

Cerny wants to sell you consoles, so of course he uses marketing speak to tickle fanboy balls
Cerny is the Lead Architect of the PS5, while you are just some random poster on a forum who can easily lie with no repercussions.

With that being said, I'll take Cerny's word over anyone else's.

Like I said, the PS5's potential is the sum of all its parts.
The higher-clock/narrow design may be there to assist with faster streaming of high-quality textures and high-poly meshes.

Clearly, you have not seen what the PS5 exclusives look like (Demon's Souls, Returnal, Ratchet & Clank: Rift Apart, not to mention Spider-Man: Miles Morales).
And those are only the year-one exclusives.

Those games utilize the PS5 hardware better than this game does, and it keeps getting better with each release.

You are nothing but a blind hater if you think this game determines the PS5's potential.
 
Last edited:

SlimySnake

Flashless at the Golden Globes
"better perf myth"... It depends from game to game. Why omit CP2077 and Flight Sim results which show up to 11% higher fps on 6600 XT despite the avg. TF difference seems to be around 6% between 5700 XT (9.9 TF) and 6600 XT (10.6 TF)? Because that would go against your narrative.

11% higher avg. fps
[benchmark screenshot]


10% higher avg. fps
[benchmark screenshot]


The 6600 XT also has a massive bandwidth disadvantage compared to the 5700 XT, so games favoring more BW will see the 5700 XT pulling ahead of the 6600 XT. You didn't debunk anything here.
Based on these two screens, I am getting a TFLOPS difference of 4-6%, but the fps shows a higher 10-11% increase.

The problem is that not every game shows this. That's why the Time Spy comparison I showed was far more interesting: an 11% performance improvement despite a 32 vs 40 CU disadvantage and possibly only a 4-5% TFLOPS advantage.
 

Sosokrates

Report me if I continue to console war
Cerny is the Lead Architect of the PS5, while you are just some random poster on a forum who can easily lie with no repercussions.

With that being said, I'll take Cerny's word over anyone else's.

Like I said, the PS5's potential is the sum of all its parts.
The higher-clock/narrow design may be there to assist with faster streaming of high-quality textures and high-poly meshes.

Clearly, you have not seen what the PS5 exclusives look like (Demon's Souls, Returnal, Ratchet & Clank: Rift Apart, not to mention Spider-Man: Miles Morales).
And those are only the year-one exclusives.

Those games utilize the PS5 hardware better than this game does, and it keeps getting better with each release.

You are nothing but a blind hater if you think this game determines the PS5's potential.

If Cerny was telling the truth on the clockspeed + CU matter, why haven't we seen any examples of his claims?
We have seen the opposite: examples showing the PS5's higher-clocked GPU performing very close to lower-clocked PC GPUs of equal TFLOPS and similar architecture.
 
Last edited:

SlimySnake

Flashless at the Golden Globes
If Cerny was telling the truth on the clockspeed + CU matter, why haven't we seen any examples of his claims?
Several people have posted examples in this very thread. You just chose to ignore them. That's not Cerny's fault.
We have seen the opposite: examples showing the PS5's higher-clocked GPU performing very close to lower-clocked PC GPUs.

Where have we seen this? What games? I had no idea DF or NX Gamer was able to find a 48 CU RDNA 2.0 GPU and compare it against the PS5's 36 CU GPU. There have been no direct comparisons between the PS5, the 32 CU 6600 XT, and the 40 CU 6700 XT.
 

Snake29

RSI Employee of the Year
If Cerny was telling the truth on the clockspeed + CU matter, why haven't we seen any examples of his claims?
We have seen the opposite: examples showing the PS5's higher-clocked GPU performing very close to lower-clocked PC GPUs of equal TFLOPS and similar architecture.

Tell me something else. Why haven't we seen the XSX running way better than the PS5, as MS claimed?

I think this is the big lie in the console business, while the PS5 constantly runs games better than or equal to the XSX.
 

Sosokrates

Report me if I continue to console war
Several people have posted examples in this very thread. You just chose to ignore them. That's not Cerny's fault.


Where have we seen this? What games? I had no idea DF or NX Gamer was able to find a 48 CU RDNA 2.0 GPU and compare it against the PS5's 36 CU GPU. There have been no direct comparisons between the PS5, the 32 CU 6600 XT, and the 40 CU 6700 XT.

What examples? The Hitman 3 one shows the opposite of Cerny's claims.
There is no example of the PS5 performing significantly better than a 5700 XT in games without ray tracing.
 

Md Ray

Member
So why would someone believe what Cerny said regarding clockspeeds when real-world examples are not showing this?
If TFLOPS were the be-all and end-all of measuring GPU perf, then this should not be happening in real-world examples:

Control:
[benchmark screenshot]


RE8:
[benchmark screenshot]


As I've been saying for quite a while, it varies from game to game and even scene to scene. Despite the PS5 having less TF and bandwidth than the XSX, we can see the higher clock speed is just enough to bring it on par with the XSX in some scenarios, due to higher rasterization throughput.

Regarding 5700 XT vs PS5, there will be scenarios where the PS5 has a couple of fps advantage over the 5700 XT and vice versa. Both have their advantages and disadvantages. The 5700 XT has more bandwidth to play with, whereas the PS5 has higher FP32, pixel rate, texture rate, and rasterization throughput.
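
For concreteness, those fixed-function rates scale directly with clock. A quick sketch (the PS5 unit counts below are the commonly cited figures, not officially confirmed by Sony, and the 5700 XT clock is an approximate in-game boost):

Code:
# Fill rates scale with clock, which is where the PS5's frequency edge shows up.
def pixel_rate(rops: int, ghz: float) -> float:
    return rops * ghz  # Gpixels/s

def texel_rate(tmus: int, ghz: float) -> float:
    return tmus * ghz  # Gtexels/s

# PS5: 64 ROPs / 144 TMUs @ up to 2.23 GHz; 5700 XT: 64 ROPs / 160 TMUs @ ~1.9 GHz
print(pixel_rate(64, 2.23), pixel_rate(64, 1.9))    # ~142.7 vs ~121.6 Gpixels/s
print(texel_rate(144, 2.23), texel_rate(160, 1.9))  # ~321.1 vs ~304.0 Gtexels/s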
 
So you're going to ignore this? Why does the 6600 XT have up to 11% higher fps here with just a 6% increase in TF and much lower bandwidth?

Based on these two screens, I am getting a TFLOPS difference of 4-6%, but the fps shows a higher 10-11% increase.

The problem is that not every game shows this. That's why the Time Spy comparison I showed was far more interesting: an 11% performance improvement despite a 32 vs 40 CU disadvantage and possibly only a 4-5% TFLOPS advantage.
You both seem to be saying clocks give a 5% advantage by your numbers, but keep saying 11% for some reason.
 

Loxus

Member
Technically there is nothing special about TLOU2. It has similar polycounts and texture quality to other gen-8 games, it uses screen-space reflections and cube maps, and its global illumination is baked most of the time.
The reason TLOU2 looks so good is that the talented artists at Naughty Dog have spent a lot of time crafting the environments by hand.
Cerny gave an example that fewer CUs and higher clocks would produce significant performance improvements over more CUs and a lower clockspeed.
I wish he would have shown a demo of this, because people have done this experiment with PC GPUs and the higher-clocked parts perform the same.
We're also not seeing the PS5 perform significantly better than a PC equipped with a 5700 XT (without ray tracing), and the PS5's performance compared to the Series X is what you would expect for a GPU with about 18% fewer TFLOPS.

So why would someone believe what Cerny said regarding clockspeeds when real-world examples are not showing this?
Why you think Mark Cerny would lie about the PS5 is beyond me.

This is why this whole 5700XT discussion is ridiculous.
Cerny had 2 PS5 configurations.
36 CU @ 1GHz
48 CU @ 0.75GHz
And both resulted in 4.6 TF.
He found that the 36 CU config, in conjunction with all the other PS5 parts, performed better than the 48 CU one.

Do you have a 48 CU PS5 to compare?
The 5700XT is not a fucking PS5 GPU. Why you guys compare a PC part to a custom GPU is beyond me; it would be common sense to compare two PS5s.
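
For what it's worth, the arithmetic behind those two configs checks out; both land on exactly the same FLOP count, which was the point of Cerny's comparison:

Code:
# Cerny's two hypothetical PS5 configs from the Road to PS5 talk
def tflops(cus, clock_ghz):
    return cus * 64 * 2 * clock_ghz / 1000

print(tflops(36, 1.00))  # 36 CU @ 1.00 GHz -> 4.608 TF
print(tflops(48, 0.75))  # 48 CU @ 0.75 GHz -> 4.608 TF (identical)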
 
Last edited:

Sosokrates

Report me if I continue to console war
Tell me something else. Why haven't we seen the XSX running way better than the PS5, as MS claimed?

I think this is the big lie in the console business, while the PS5 constantly runs games better than or equal to the XSX.
Microsoft have never claimed the XSX runs way better than the PS5.

PS5 games ran better when the XSX tools were half-baked. It's been a while since a PS5 game has run better, apart from the ray tracing bug in Little Nightmares 2 on XSX.
 

SlimySnake

Flashless at the Golden Globes
You both seem to be saying clocks give a 5% advantage by your numbers but keep saying 11% for some reason.
5% is the TFLOPS difference: 10.2 vs 10.6 TFLOPS at the respective 1.99 and 2.6 GHz clocks.
11% is the fps difference: 67 vs 61 fps.
It should be 1:1, so the difference should be 5% over 61 fps, i.e. about 64 fps, not 67 fps. That means the higher clocks are offering an extra ~6% in performance.
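
Spelling that logic out (note the exact ratios come out slightly under the rounded 5% and 11% figures being thrown around):

Code:
tf_5700xt, tf_6600xt = 10.2, 10.6  # TFLOPS figures from this post
fps_5700xt, fps_6600xt = 61, 67    # measured average fps

tf_gain = tf_6600xt / tf_5700xt - 1     # ~3.9% more FLOPS
fps_gain = fps_6600xt / fps_5700xt - 1  # ~9.8% more fps

# If performance scaled 1:1 with TFLOPS, the 6600 XT should land around:
expected = fps_5700xt * (1 + tf_gain)   # ~63.4 fps, not the 67 measured
print(f"TF gain {tf_gain:.1%}, fps gain {fps_gain:.1%}, expected ~{expected:.0f} fps")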
 

Md Ray

Member
Why you think Mark Cerny would lie about the PS5 is beyond me.

This is why this whole 5700XT discussion is ridiculous.
Cerny had 2 PS5 configurations.
36 CU @ 1GHz
48 CU @ 0.75GHz
And both resulted in 4.6 TF.
He found that the 36 CU config, in conjunction with all the other PS5 parts, performed better than the 48 CU one.

Do you have a 48 CU PS5 to compare?
The 5700XT is not a fucking PS5 GPU. Why you guys compare a PC part to a custom GPU is beyond me; it would be common sense to compare two PS5s.
Exactly, the 5700 XT has the entirety of its 448 GB/s bandwidth to itself, while the PS5 has to share a significant chunk of its 448 GB/s with the CPU.

Here's the CPU memory bandwidth of 4700S (PS5 SoC):
[chart: 4700S CPU memory bandwidth benchmark]

Source
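
A back-of-the-envelope way to think about that sharing (the CPU figure below is purely illustrative; the real number varies per game and isn't public):

Code:
total_bw = 448  # GB/s, PS5's shared GDDR6 pool (same nominal figure as the 5700 XT)
cpu_bw = 40     # GB/s, hypothetical CPU draw during gameplay (illustrative only)

# Ignores arbitration/contention overhead, which makes the real hit even bigger
gpu_bw = total_bw - cpu_bw
print(f"GPU left with ~{gpu_bw} GB/s vs the 5700 XT's dedicated {total_bw} GB/s")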
 

Sosokrates

Report me if I continue to console war
Why you think Mark Cerny would lie about the PS5 is beyond me.

This is why this whole 5700XT discussion is ridiculous.
Cerny had 2 PS5 configurations.
36 CU @ 1GHz
48 CU @ 0.75GHz
And both resulted in 4.6 TF.
He found that the 36 CU config, in conjunction with all the other PS5 parts, performed better than the 48 CU one.

Do you have a 48 CU PS5 to compare?
The 5700XT is not a fucking PS5 GPU. Why you guys compare a PC part to a custom GPU is beyond me; it would be common sense to compare two PS5s.

It's based on the RDNA2 architecture, whose feature set is malleable. Sony decided to remove hardware VRS tier 2 and mesh shading support, and went with the older RDNA1 VRS tier 1 and primitive shaders.
Aside from these new feature sets and power-consumption improvements, RDNA1 and RDNA2 perform very similarly in games not using ray tracing. So while Sony did choose which features of RDNA2 they wanted, it does not make the PS5's GPU that much different from a 5700 XT, especially when trying to see if higher clocks and fewer CUs make as much of a difference as Cerny claimed.
 

Md Ray

Member
No. If TFLOPS is 6% higher on the 6600 XT, you have to ignore that first 6% advantage. The advantage is 11% - 6% = 5%.
So the clocks are giving 5% better results than a higher CU count.
In the case of CP2077 and FS 2020, yes. Maybe it would've been more than a 10-11% increase if the bandwidth were identical between the two cards.
 

SlimySnake

Flashless at the Golden Globes
What examples? The Hitman 3 one shows the opposite of Cerny's claims.
There is no example of the PS5 performing significantly better than a 5700 XT in games without ray tracing.
So one game, and the 5700 XT. No RDNA 2.0 cards. You made it sound like we had lots of games showing direct comparisons between the PS5 and the higher-CU, lower-clocked RDNA cards.

Here are some comparisons of an RDNA 2.0 card with 52 CUs at 1.825 GHz and how it performs compared to the PS5. This is a far bigger sample size, and even this shouldn't be taken as gospel.

[benchmark comparison screenshots]


The 6700 XT and 6600 XT should be compared to the PS5 and XSX. That's what DF should be doing. We would get a much clearer picture then.
 

3liteDragon

Member
Sony decided to remove hardware VRS tier 2 and mesh shading support, and went with the older RDNA1 VRS tier 1 and primitive shaders.
No they didn't; they went with their own solution (see the patents), since RDNA 1's primitive shader feature didn't give devs the "full programmatic control" over the geometry pipeline that Cerny says their custom solution does.
 
Last edited:

SirVoltus

Member
I was talking about the performance mode, not the visual quality one. The visual quality mode has a fluctuating framerate.
Oh I see.
Anyway, I think the one TechPowerUp posted in their PC review is at max graphics settings, so it's not comparable to the PS5's performance mode then.
 
In the case of CP2077 and FS 2020, yes. Maybe it would've been more than a 10-11% increase if the bandwidth were identical between the two cards.
I know this isn't an exact comparison, but in Cyberpunk at 1080p medium the 5700 XT performs a little better than the % TFLOPS difference over the 5600 XT would suggest. The 5700 XT has more CUs, higher clocks, and double the bandwidth.
 
So one game, and the 5700 XT. No RDNA 2.0 cards. You made it sound like we had lots of games showing direct comparisons between the PS5 and the higher-CU, lower-clocked RDNA cards.

Here are some comparisons of an RDNA 2.0 card with 52 CUs at 1.825 GHz and how it performs compared to the PS5. This is a far bigger sample size, and even this shouldn't be taken as gospel.

[benchmark comparison screenshots]


The 6700 XT and 6600 XT should be compared to the PS5 and XSX. That's what DF should be doing. We would get a much clearer picture then.
Out picking cherries, I see hahaha. I kid, I kid.
 

FireFly

Member
nah. Most in-game benchmarks I've seen have the 5700 XT running at around 1.95-2.0 GHz during gameplay. It's the same as my RTX 2080 and pretty much every Nvidia card starting from Pascal. They run well over even the advertised boost clocks.
Are these reference cards? From the 5700 XT TechPowerUp review:

 

SlimySnake

Flashless at the Golden Globes
Out picking cherries, I see hahaha. I kid, I kid.
Like I said, it shouldn't be taken as gospel, because these 0.1% minimum framerate tests Alex does are simply inaccurate. PC benchmarks always use average framerates as the standard way to compare two different graphics cards. Alex finds the worst point in games and bases his comparisons on that. It's just completely inaccurate for what he's trying to do.

The funny thing is that if he had done that in the Control comparison, i.e., just taken one point of comparison, we would not have gotten an almost 16% difference between the two consoles across the 20 or so screenshots he ended up taking. So he clearly knows what he needs to do but doesn't follow his own standards 99% of the time.
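
For context, here's roughly how average vs percentile-low fps figures are derived from a frame-time log (a minimal sketch with made-up sample data):

Code:
import numpy as np

# Hypothetical frame-time capture in milliseconds (illustrative data only)
frametimes_ms = np.array([16.7, 16.9, 16.6, 33.4, 16.8, 17.0, 16.7, 50.1])

avg_fps = 1000 / frametimes_ms.mean()
# A common simplification: "1% low" as the fps at the 99th-percentile frame time
low_1pct_fps = 1000 / np.percentile(frametimes_ms, 99)
print(f"avg {avg_fps:.1f} fps, 1% low {low_1pct_fps:.1f} fps")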
 

Sosokrates

Report me if I continue to console war
No they didn't; they went with their own solution (see the patents), since RDNA 1's primitive shader feature didn't give devs the "full programmatic control" over the geometry pipeline that Cerny says their custom solution does.
I guess Alex from DF is incorrect on that one then; in a recent DF Weekly he said the PS5 uses primitive shaders from RDNA1.
 

Snake29

RSI Employee of the Year
What examples? The Hitman 3 one shows the opposite of Cerny's claims.
There is no example of the PS5 performing significantly better than a 5700 XT in games without ray tracing.

And even Hitman 3 isn't showing a big difference at all, since the performance is more stable on the PS5. I don't put much stock in this game graphics-wise anyway. If they revisited the game, I bet Hitman 3 would be the same on both platforms. Hitman 3 shows that the game itself is the problem (maybe the engine), not the PS5.
 
Last edited:

Sosokrates

Report me if I continue to console war
And even Hitman 3 isn't showing a big difference at all, since the performance is more stable on the PS5. I don't put much stock in this game graphics-wise anyway. If they revisited the game, I bet Hitman 3 would be the same on both platforms. Hitman 3 shows that the game itself is the problem (maybe the engine), not the PS5.

Point is, the 5700 XT outperforms the PS5 in that test. I would have expected the PS5 to perform better given what Cerny said about the significant performance increase that higher clockspeeds produce.
However, the XSX also underperforms compared to the 5700 XT, IMO.
 
Last edited:

Md Ray

Member
Point is, the 5700 XT outperforms the PS5 in that test. I would have expected the PS5 to perform better given what Cerny said about the significant performance increase that higher clockspeeds produce.
Again...
Regarding 5700 XT vs PS5, there will be scenarios where the PS5 has a couple of fps advantage over the 5700 XT and vice versa. Both have their advantages and disadvantages. The 5700 XT has more bandwidth to play with, whereas the PS5 has higher FP32, pixel rate, texture rate, and rasterization throughput.

The 5700 XT has higher effective bandwidth than the PS5.
 

FireFly

Member
If we use 3DCenter's meta-summary of all the 6600 XT reviews, then the 6600 XT is 5.6% faster at 1080p.



So even assuming the majority of sites were using 2 GHz clocked 5700 XTs vs 2.6 GHz clocked 6600 XTs, that's 10.24 TF vs 10.65 TF, or 4% faster. So you get 1.6% more performance than expected. And given the way the Infinity Cache works, that may be due to the 6600 XT having more effective bandwidth at 1080p. (Based on AMD's slides, the hit rate is 55% at 1080p, so 55% of the time you get up to a 3.25x bandwidth boost.)
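
Plugging AMD's slide numbers into a naive effective-bandwidth estimate (hedged: the 3.25x multiplier is AMD's peak claim, not a sustained rate):

Code:
mem_bw = 256     # GB/s, 6600 XT's raw GDDR6 bandwidth (128-bit @ 16 Gbps)
hit_rate = 0.55  # Infinity Cache hit rate at 1080p, per AMD's slides
boost = 3.25     # up to 3.25x bandwidth on a cache hit (AMD's peak claim)

effective_bw = hit_rate * (mem_bw * boost) + (1 - hit_rate) * mem_bw
print(f"~{effective_bw:.0f} GB/s effective")  # ~573 GB/s, well above the 5700 XT's 448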
 

GametimeUK

Member
I really appreciate Alex for these videos. Since I own all the consoles and a PC, it's really nice to be given the console settings to use as a baseline and tweak accordingly. It really helps with deciding which version of a game I should buy and how much better the game will look/perform on PC.
 