
Digital Foundry: Deathloop PC vs PS5, Optimised Settings, Performance Testing + More

Md Ray

Member
Oh, I know that RT lighting is where the good shit is; I was never impressed by reflections/shadows.

But are consoles capable of doing great RT lighting in next-gen games without sacrificing too much?!

From my understanding, good RT lighting is crazy heavy. I tried it on my PC with Cyberpunk and Control, but even with DLSS I wasn't able to get 60 frames at high resolution with a 2070 Super, a GPU comparable to a PS5, and consoles don't even have DLSS...
Metro Exodus EE, with its RT lighting and GI, even runs on Series S (with a heavy res downgrade, of course).
 

GymWolf

Member
Insomniac Games proves that you can put accurate RT reflections in a GREAT-LOOKING non-cross-gen game and at 60FPS! Consoles can pull off RT, but devs need to be smart about it.
They are more accurate than fake reflections, but they are still not 1:1 if you look closely.

And like I said, Morales is a cross-gen game, and Ratchet is a relatively small game with non-photorealistic graphics.

Let's see if Horizon Forbidden West, a cross-gen game that's vastly better looking than Morales and far bigger than Ratchet, is gonna use RT and how...

And you don't know how much they're gonna cut to include RT reflections. Is the trade-off really worth it?! We will never know...

Personally, I'm perfectly fine with fake reflections, since I don't stop swinging in Spider-Man mid-air to check whether reflections are accurate or not, and I honestly feel sad for whoever does that in a game like Spider-Man or one with fast-paced combat.

Accurate reflections belong in photo mode until we have enough power to spare that including them sacrifices nothing.

RT lighting is far more scene-changing, and you can forget about that on console in next-gen games.
 

Neofire

Member
I'm still trying to figure out the point of a PS5/PC comparison... You have a $3,000-plus PC rig (with $600-2,000 GPUs alone) versus a $500 system 👀. To me it serves no purpose 🤷🏿‍♂️ when everyone should already know the obvious.
 

Armorian

Banned
Your math is wrong.
PS5 beats all Pascal cards in raster, the closest being the 2080 Ti.
PS5 beats all RDNA1 cards in raster, the closest being the RX 5700 XT.

Look where the 5700 XT and 2080 Ti are in this picture in RASTER performance:

[image: GPU relative raster performance chart]


Compared to NVIDIA cards the PS5 sits below the 2070S and above the 2070, and it's at ~5700 XT level versus AMD cards.

If you have PROOF that says otherwise, share it with us.
 

assurdum

Banned
Most 5700 XT GPUs (not the shitty reference cooler) average around 2,000-2,050 MHz, which is around 10.2-10.5 TFLOPs, almost exactly a PS5 GPU. The PS5 is likely a little bit faster due to the higher clock speed and the lower number of CUs that need to be kept fed, but the 5700 XT doesn't have to share memory bandwidth, so the difference will be minor.

Saying a PS5 is roughly equal to or slightly better than a 5700 XT in non-RT makes sense.
Wasn't the base frequency of the 5500 XT 1.66 GHz? And from what I have seen, the PS5 outperforms it very often in pure rasterization. I doubt the PS5 will stay at the same level of performance for the whole generation.
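For reference, the TFLOP figures being traded here come straight from shader count times clock; a minimal sketch of that arithmetic (the shader counts are the public specs, the clocks are the ones quoted above):

```python
# Minimal sketch: peak FP32 TFLOPs = 2 FLOPs/cycle (FMA) * shader count * clock (GHz).

def tflops(shaders: int, clock_ghz: float) -> float:
    """Peak FP32 throughput in TFLOPs, assuming one FMA (2 FLOPs) per shader per cycle."""
    return 2 * shaders * clock_ghz / 1000.0

print(f"5700 XT @ 2.00 GHz: {tflops(2560, 2.00):.2f} TF")  # ~10.24 TF
print(f"5700 XT @ 2.05 GHz: {tflops(2560, 2.05):.2f} TF")  # ~10.50 TF
print(f"PS5     @ 2.23 GHz: {tflops(2304, 2.23):.2f} TF")  # ~10.28 TF
```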
 

FireFly

Member
Why is it that when the PS5 outperforms the 5700 XT it's because of bad PC optimization, but when it's the same performance, that's the only truthful scenario? Based on what? GPU frequency is quite a bit higher on the PS5, but as always, people just count the specs that fit their favourite narrative.
What benchmarks do we have showing the PS5 outperforming a 5700 XT?

The 6600 XT with its 2.6 GHz clock speeds and the 5700 XT with its 1.98 GHz in-game clock speeds are a good comparison. That's roughly a 10.2 TFLOPs GPU vs a 10.6 TFLOPs GPU. The 6600 XT is bottlenecked by a 128-bit memory interface, but that mostly only comes into play at higher resolutions.
So at 1.98 GHz vs the 2.64 GHz in the TechPowerUp review, we're talking about 10.81 TF vs 10.14 TF. That's a 6.6% advantage. But I don't think we can just assume the 5700 XT being used in the review was averaging those clock speeds. How do we know they weren't using the reference 5700 XT they used for their original review, which averaged 1.88 GHz?
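Plugging the quoted clocks into the same peak-FP32 formula with the public shader counts (2,048 for the 6600 XT, 2,560 for the 5700 XT) reproduces those numbers; a quick sketch:

```python
# Peak FP32: 2 FLOPs per shader per cycle (FMA) * shaders * clock (GHz) -> TF
tf_6600xt = 2 * 2048 * 2.64 / 1000  # 6600 XT at TechPowerUp's 2.64 GHz avg -> ~10.81 TF
tf_5700xt = 2 * 2560 * 1.98 / 1000  # 5700 XT at 1.98 GHz -> ~10.14 TF
print(f"{(tf_6600xt / tf_5700xt - 1) * 100:.1f}% advantage")  # ~6.7%, i.e. the ~6.6% above
```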
 

Darius87

Member
Look where the 5700 XT and 2080 Ti are in this picture in RASTER performance:

[image: GPU relative raster performance chart]


Compared to NVIDIA cards the PS5 sits below the 2070S and above the 2070, and it's at ~5700 XT level versus AMD cards.

If you have PROOF that says otherwise, share it with us.
Raster is calculated as ROPs × clock = GPixel/s; your graph is relative performance.
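A sketch of that fillrate arithmetic, using the public ROP counts and reference boost clocks (both chosen here for illustration):

```python
def gpixel_per_s(rops: int, clock_ghz: float) -> float:
    """Peak pixel fillrate: ROPs * clock (GHz) -> GPixel/s."""
    return rops * clock_ghz

print(f"5700 XT: {gpixel_per_s(64, 1.905):.1f} GPixel/s")  # 64 ROPs * 1.905 GHz boost
print(f"2080 Ti: {gpixel_per_s(88, 1.545):.1f} GPixel/s")  # 88 ROPs * 1.545 GHz boost
```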
 

assurdum

Banned
What benchmarks do we have showing the PS5 outperforming a 5700 XT?


So at 1.98 GHz vs the 2.64 GHz in the TechPowerUp review, we're talking about 10.81 TF vs 10.14 TF. That's a 6.6% advantage. But I don't think we can just assume the 5700 XT being used in the review was averaging those clock speeds. How do we know they weren't using the reference 5700 XT they used for their original review, which averaged 1.87 GHz?
Ubisoft games, or games AMD promotes, typically perform better on PS5, but sure, I'm ready to read the bad PC port excuse. Still, I'm waiting to see how things look when the generation goes further, because I doubt this narrative will survive a couple of years. Console hardware tends to be pushed harder as the years pass.
 

Md Ray

Member
I'm still trying to figure out the point of a PS5/PC comparison... You have a $3,000-plus PC rig (with $600-2,000 GPUs alone) versus a $500 system 👀. To me it serves no purpose 🤷🏿‍♂️ when everyone should already know the obvious.
The point is to inform gamers on PC about optimal settings, which more often than not line up with the settings devs use on consoles, the ones that offer the best 'bang for your buck', i.e. not sacrificing too much fps while also making the game look close to Ultra settings. It helps those using low-to-mid-range hardware get the most out of their rigs. It might even help those with high-end rigs if they're targeting 120/144fps.
 

Neofire

Member
The point is to inform gamers on PC about optimal settings, which more often than not line up with the settings devs use on consoles, the ones that offer the best 'bang for your buck', i.e. not sacrificing too much fps while also making the game look close to Ultra settings. It helps those using low-to-mid-range hardware get the most out of their rigs. It might even help those with high-end rigs if they're targeting 120/144fps.
Don't they have benchmarks for that?
 
The 6600 XT with its 2.6 GHz clock speeds and the 5700 XT with its 1.98 GHz in-game clock speeds are a good comparison. That's roughly a 10.2 TFLOPs GPU vs a 10.6 TFLOPs GPU. The 6600 XT is bottlenecked by a 128-bit memory interface, but that mostly only comes into play at higher resolutions.

[image: Time Spy benchmark chart, 6600 XT vs 5700 XT]


Time Spy benchmarks give it a lead of 11%.



Typically, TFLOPs are TFLOPs, but I always found it interesting that NVIDIA GPUs enjoyed a significant lead over AMD GPUs until RDNA, when AMD was finally able to increase clock speeds to 1.8-1.98 GHz and, all of a sudden, they started to match the standard rasterization performance of equivalent NVIDIA graphics cards. NVIDIA GPUs starting from Pascal had always clocked very high, at around 1.7-2.0 GHz. My RTX 2080 has hit 2,050 MHz even though the boost clock was supposed to max out at 1.7 GHz according to the specs.

Now the 6600 XT hits 2.6 GHz and is able to offer 11% more performance despite having 8 fewer CUs and just a minor 3% increase in TFLOPs.
Your graph shows a 3% increase.
It's about pushing both CPU + GPU (visuals) for them. I've seen the leaked TLoU 2 builds where they had a CPU (and GPU) utilization metric showing on screen; you'd be surprised to see just how many simple scenes during gameplay, with not a lot going on in them, were constantly using 25-28 ms of CPU time... so between 40 and 35 fps. Only during the cut-scenes would the CPU time drop to 15-16 ms (60+ fps), only for the framerate to be limited by the GPU, which would generally take around 20-25 ms (50-40 fps) of GPU time.

In busier sections with combat and gunfights, there were even occasions where the PS4's GPU had headroom to push closer to 50 fps, but it was the CPU dragging the framerate down to 30 fps and even below.
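Those frame-time figures convert straight to the fps numbers in parentheses; a minimal sketch of the conversion:

```python
def ms_to_fps(frame_time_ms: float) -> float:
    """A frame budget of t milliseconds caps the framerate at 1000 / t fps."""
    return 1000.0 / frame_time_ms

for ms in (25, 28, 15, 16, 20):
    print(f"{ms} ms -> {ms_to_fps(ms):.1f} fps")
# 25-28 ms -> 40.0-35.7 fps (the gameplay CPU times above)
# 15-16 ms -> 66.7-62.5 fps (the cut-scene CPU times)
# 20-25 ms -> 50.0-40.0 fps (the quoted GPU times)
```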
You're looking at it like they designed the game on a trash CPU and then you drop in a good CPU. If the PS4 had had a good CPU from the get-go, I'm confident ND would still have pushed visuals and we would still have a 30fps game.

The PS5 has never been, nor could be, benchmarked.
The differences are on the developers.

As noted above... the PS5 GPU is SIGNIFICANTLY better than a last-gen 5700 XT.
You don't get blood from a stone.

Do you understand RDNA vs RDNA 2? Or are you going to continue to ignorantly try and spin it away?
No games have shown that yet.
 

Md Ray

Member
You're looking at it like they designed the game on a trash CPU and then you drop in a good CPU. If the PS4 had had a good CPU from the get-go, I'm confident ND would still have pushed visuals and we would still have a 30fps game.
No, I said it's not just about visuals. They also push the CPU side of things just as much, which involves audio, animations, physics, and whatnot.
 

RoboFu

One of the green rats
It’s kind of funny that Deathloop, of all things, is where some people want to plant their fanboy flags. 😂
 
[image: console performance chart]

This is from

And from TechPowerUp... those red circles are where PS5 GPU territory is:
[image: TechPowerUp relative performance chart with PS5 territory circled]

On PS5 the game is locked at 60 fps 99% of the time (mostly at around >1400p according to both El Analista and VGTech). Usually you need an average quite a bit higher than 60 fps to get that kind of consistency (maybe ~70 fps). You need to look at the 1% min framerate, not the average, when comparing using PC benchmarks.
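For anyone unfamiliar with the metric, here is a rough sketch of how a 1% low can be computed from captured frame times (conventions vary between capture tools; this is one common approach):

```python
def one_percent_low_fps(frame_times_ms: list[float]) -> float:
    """Average the slowest 1% of frames, then convert that average to fps."""
    slowest_first = sorted(frame_times_ms, reverse=True)
    count = max(1, len(slowest_first) // 100)  # the worst 1% of frames
    avg_ms = sum(slowest_first[:count]) / count
    return 1000.0 / avg_ms

# A capture that mostly runs ~14.3 ms (~70 fps) but has occasional 20 ms spikes:
capture = [14.3] * 990 + [20.0] * 10
print(one_percent_low_fps(capture))  # 50.0 -- the ~70 fps average hides the dips
```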
 
Wait till Naughty Dog does their magic. Then decide if a PS5 Pro is needed.

A PS5 Pro would benefit all games, including those from the ultra-elite devs.
It doesn't though.
Only foolish PC fanboys compare off-the-shelf parts to custom parts.

The 5700 XT is already outdated, whereas the PS5 will continue to put out award-winning visuals.

Stop. You just sound silly. The devs are putting out award-winning visuals. Cerny didn't personally create some holy grail of graphical computing using unicorn farts. You just sound like a silly fanboy. It's a piece of silicon. It's a good performer for the price, but it's not anywhere close to top-tier PC gaming performance. Get over it.
 

Dream-Knife

Banned
If it's not about performance, what is it about?

Oh, nothing. My mistake. PC gaming = PS5 gaming but more expensive, so the only reason to play on PC is if you like to waste more money...
PC gaming is great because of freedom.
Look at the Steam Deck. Freedom in a handheld. It's less powerful than other options, but you have freedom.

Of course materialistic people enjoy PC because there is always something new to upgrade to in an attempt to fill the hole in their soul, but that doesn't mean you need to if you just enjoy using it.



Show me a benchmark of a game running worse on an RDNA1 card than on an RDNA2 card with the same TF performance,
like a 5700 XT vs a 6600 XT; they are close enough.

edit:
[images: five game benchmark charts, 5700 XT vs 6600 XT]


Notice how the 6600 XT boosts way higher, debunking the high clocks = better performance myth... but it ultimately has similar TF performance; it can in fact boost higher and reach about 10.6 TF, while the 5700 XT only reaches about 9.7 TF when boosting to its max clock (although there are variants that boost close to 10.2 TF).
Thank you for this. A few weeks ago there was a thread where people were claiming that the PS5 had some magic due to its clock speed.
"better perf myth"... It depends from game to game. Why omit CP2077 and Flight Sim results which show up to 11% higher fps on 6600 XT despite the avg. TF difference seems to be around 6% between 5700 XT (9.9 TF) and 6600 XT (10.6 TF)? Because that would go against your narrative.

11% higher avg. fps
[image: Cyberpunk 2077 benchmark chart]


10% higher avg. fps
[image: Microsoft Flight Simulator benchmark chart]


The 6600 XT also has a massive bandwidth disadvantage compared to the 5700 XT, so games favoring more BW will see the 5700 XT pulling ahead of the 6600 XT. You didn't debunk anything here.
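That bandwidth gap follows directly from the public memory specs; a sketch of the arithmetic:

```python
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth: (bus width in bytes) * effective data rate per pin."""
    return bus_width_bits / 8 * data_rate_gbps

print(f"5700 XT: {bandwidth_gb_s(256, 14):.0f} GB/s")  # 256-bit GDDR6 @ 14 Gbps -> 448 GB/s
print(f"6600 XT: {bandwidth_gb_s(128, 16):.0f} GB/s")  # 128-bit GDDR6 @ 16 Gbps -> 256 GB/s
```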
What about Infinity Cache?
 

ToTTenTranz

Banned
Nah, RDNA2 has ZERO IPC gains over RDNA 1.0. They are virtually identical aside from better power consumption and ray tracing capabilities.

RDNA 1.0 had a 25% IPC gain over Polaris and Vega. Polaris had a 25% IPC gain over GCN 1.0, or what's in the base PS4. RDNA 2.0 had no gains other than perf per watt, which does not improve performance per clock.


There's zero Instructions Per Clock advantage, but by clocking significantly higher, the RDNA2 GPUs get higher performance than RDNA1 GPUs with execution units of similar count/width.

The PS5 is faster than the 5700 XT due to ~15% higher texture and pixel fillrates, while the cache scrubbers and faster access to I/O compensate for the lower available bandwidth.
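A sketch of where a figure like that can come from. Assumptions: 64 ROPs and 144 TMUs for the PS5 at its 2.23 GHz cap (the PS5's ROP count is not officially published), and the 5700 XT's 64 ROPs / 160 TMUs at its 1.905 GHz boost clock:

```python
ps5_ghz, xt_ghz = 2.23, 1.905  # PS5 clock cap; 5700 XT boost clock (assumptions above)

pixel_gap = (64 * ps5_ghz) / (64 * xt_ghz) - 1      # pixel fillrate: ROPs * clock
texture_gap = (144 * ps5_ghz) / (160 * xt_ghz) - 1  # texture fillrate: TMUs * clock

print(f"pixel fillrate:   +{pixel_gap * 100:.0f}%")    # ~+17%, near the quoted ~15%
print(f"texture fillrate: +{texture_gap * 100:.0f}%")  # ~+5% under these assumptions
```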
 

01011001

Banned
DF are comparing PC and a console?

Didn't they say they wouldn't do this? Or was that just for Xbox?

Seems like the usual suspects have misunderstood the video entirely; the PS5 comparison is done in order to find PC settings that save performance and still look very good. Hence the settings he suggests you run are basically the PS5 settings.

Nothing in this video is meant to compare a PC with the PS5, only to show how PS5 settings can lift your PC performance over the often unnecessary Ultra settings.
 

assurdum

Banned
[image: console performance chart]

This is from

And from TechPowerUp... those red circles are where PS5 GPU territory is:
[image: TechPowerUp relative performance chart with PS5 territory circled]

El Analista again? Why people trust him I don't know; his FPS analyses are completely unreliable, and let's not even talk about the native resolution or the graphics settings he tried to extrapolate.
 
Seems like the usual suspects have misunderstood the video entirely; the PS5 comparison is done in order to find PC settings that save performance and still look very good. Hence the settings he suggests you run are basically the PS5 settings.

Nothing in this video is meant to compare a PC with the PS5, only to show how PS5 settings can lift your PC performance over the often unnecessary Ultra settings.
That, and you can compare in this game since DRS is available and the console isn't using some unavailable checkerboard resolution. Non-RT mode of course, since PC RT can't scale that low.
 

Md Ray

Member
Yep. Just for Xbox. Also, MS recently asked them not to compare their games against PC. Their games running on Xbox, of course; when it's their games, here Deathloop, running on PS5, it's not a problem. Do as you wish.
IIRC, it was only Asobo Studio (the FS 2020 devs) who asked them not to share the entirety of the spreadsheet, not Microsoft.

In the end, we did get the Xbox settings for Flight Sim in that NVIDIA-sponsored vid.
 

Loxus

Member
Nice example. Cerny mentioning higher clock speed giving significant gains is pretty much a marketing lie.
Since when is a man who designed multiple consoles and holds multiple patents a liar?

So far everything he has said about the PS5 is true:
-Ray Tracing, Decompression, SSD Speed, 3D Audio, Extreme Cooling, etc.

We all know from looking at the PS3 and PS4 how games start to look when they take full advantage of the consoles, and the PS5 is no different.

Also, the PS5 is a budget device; comparing it to GPUs that cost much more than the console is just plain retarded, especially when you take the SSD cost into account.

The PS5's potential is the sum of its parts, not just the GPU. Developers not optimizing their games to take advantage of the PS5's features is why we have these stupid discussions.

Just like the PS4 had TLOU 2, the PS5 will punch above its weight and produce games in similar fashion.
 

01011001

Banned
Since when is a man who designed multiple consoles and holds multiple patents a liar?

Since he said higher clocks + a narrow design is a huge benefit, which it isn't for GPUs, as demonstrated by easy PC tests that show basically no gain in performance when running a narrow, high-clock GPU vs a wide, slower-clocked GPU when both are pushing the same amount of FLOP/s.

I literally posted several screenshots proving this in this thread, where an RDNA1 GPU has almost identical performance compared to an RDNA2 GPU that is in the same ballpark in terms of FLOP/s; in fact, the slower-clocked RDNA1 card technically had a slight disadvantage in terms of FLOP/s and still managed to basically be on par.
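The equal-FLOPs comparison being made here is easy to state numerically; a sketch with illustrative configurations (36 CUs is the PS5's count; the 40 CU clock is chosen to match):

```python
def tflops(cus: int, clock_ghz: float) -> float:
    """Peak FP32: 64 shaders per CU, 2 FLOPs per shader per cycle (FMA)."""
    return 2 * cus * 64 * clock_ghz / 1000.0

print(f"narrow + fast: 36 CUs @ 2.23 GHz -> {tflops(36, 2.23):.2f} TF")  # ~10.28 TF
print(f"wide + slow:   40 CUs @ 2.01 GHz -> {tflops(40, 2.01):.2f} TF")  # ~10.29 TF, same peak
```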

Cerny wants to sell you consoles, so of course he uses marketing speak to tickle fanboy balls.
 