
DF: Call of Duty Black Ops Cold War - PS5 vs PC - Settings And Performance Analysis

Ev1L AuRoN

Member
DLSS is a bit of an unfair comparison. I mean, of course it should be used, as it's what the card can do.

But similarly, a console wouldn't run games at native 4K and waste resources; it may use something like checkerboarding or Insomniac's temporal injection.

So maybe we should compare something like Spider-Man using temporal injection, 60 fps and RT... with visuals that look very close to actual 4K.
It is a tech that Nvidia spent time and silicon implementing; it's a core part of the architecture, and it's unfair to ignore it, especially when you potentially get better IQ as a result.

Most likely you could crank everything to Ultra and still have better performance with DLSS. Let's not ignore that the console version has dynamic 4K and, a key element, alpha effects that hit memory bandwidth hard, at a setting lower than the lowest available on PC. It's misleading to say the PS5 is in the same ballpark, which it clearly is not.
 

Arioco

Member
Testing with vsync on for PS5 vs. vsync off on PC? 🤔 Has this been explained? Because that is apples and oranges.


Does vsync have an impact on frame rate? Or any other impact on the metrics that makes the comparison unfair?

Also, do we know for sure if DF is comparing this game with vsync off on PC?


Thanks in advance.
 

ethomaz

Banned
Does vsync have an impact on frame rate? Or any other impact on the metrics that makes the comparison unfair?

Also, do we know for sure if DF is comparing this game with vsync off on PC?


Thanks in advance.
Yes, VSync impacts framerate by around 5-8%... it depends on the game... some games drop 20% in performance with VSync on.
 
It is a tech that Nvidia spent time and silicon implementing; it's a core part of the architecture, and it's unfair to ignore it, especially when you potentially get better IQ as a result.

Most likely you could crank everything to Ultra and still have better performance with DLSS. Let's not ignore that the console version has dynamic 4K and, a key element, alpha effects that hit memory bandwidth hard, at a setting lower than the lowest available on PC. It's misleading to say the PS5 is in the same ballpark, which it clearly is not.
They specifically chose a section where the PS5 version is bugged to get a like-for-like comparison at native 4K. If you want to hand the PC GPU DLSS, you have to give the PS5 dynamic resolution with reconstruction tech. I would say yes, it is indeed in the ballpark of some pretty significant PC GPUs, minimum a 2070, maximum a 3060 Ti.
 

Blizzje

Member
The key takeaway from the video is at 11m 50s.

Listen:





Effects on PS5 appear to be rendered at quarter resolution but at full resolution on PC, making it an apples-vs-oranges comparison, as that setting alone at such a high resolution [4K] could net you at least 10% extra performance on PC.



What is your source for the alleged 10 percent extra performance? The effects setting has an impact of less than 2 percent according to Game Debate:

'The rest of the graphics settings don't make a huge performance impact, like Subsurface Scattering at 1.94%, Special Effects Quality at 1.48%, Weapon Shadow at 1.24%, Order Independent Transparency at 1.16%, and Volumetric Lighting, Special Effects Shadows, and Object View Distance all at 0.7%, 0.54%, and 0.47% respectively.'


Also, if you look at the scene presented, the difference in fps is equally high when there are no (alpha) effects on screen. I'm all for an equal comparison, but it looks to me like the effects have a very small impact. They do make an impact on bandwidth, but I don't think that's the argument here.
 
What is your source for the alleged 10 percent extra performance? The effects setting has an impact of less than 2 percent according to Game Debate:

'The rest of the graphics settings don't make a huge performance impact, like Subsurface Scattering at 1.94%, Special Effects Quality at 1.48%, Weapon Shadow at 1.24%, Order Independent Transparency at 1.16%, and Volumetric Lighting, Special Effects Shadows, and Object View Distance all at 0.7%, 0.54%, and 0.47% respectively.'


Also, if you look at the scene presented, the difference in fps is equally high when there are no (alpha) effects on screen. I'm all for an equal comparison, but it looks to me like the effects have a very small impact. They do make an impact on bandwidth, but I don't think that's the argument here.

Console version effects are rendered at a super low resolution, way below the PC's lowest setting option, as they can be very demanding, especially when the native resolution is 4K.
 

SlimySnake

Flashless at the Golden Globes
I don't understand. What's causing this? Praise the lord for G-Sync if this is true; I always have vsync off.
VSync eliminates tearing, and that's usually very expensive. I don't know the reasoning behind it, though; it's always been like this.

That's why G-Sync is so important: no effect on the GPU, the monitor just eliminates the tearing for you even if the frames dip below the monitor's refresh rate. VRR does the same.
 

Fredrik

Member
VSync eliminates tearing, and that's usually very expensive. I don't know the reasoning behind it, though; it's always been like this.

That's why G-Sync is so important: no effect on the GPU, the monitor just eliminates the tearing for you even if the frames dip below the monitor's refresh rate. VRR does the same.
Isn’t it just waiting for the next screen update?

Anyway, I went and turned on vsync in Cyberpunk just now and I see no performance loss there, at least. The framerate is still fluctuating between 61 and 77 fps going by the counter. But I set the vsync to 144; there are choices for 72, 36, etc. too. Is the performance loss about where the "cap" is set? Vsync 72 would drop the highest fps to 72, cutting off 5 fps in my case at its highest. Vsync 36 would halve the framerate.
 

assurdum

Banned
Yet again, the PS5 performs exactly as expected when you look at the raw hardware numbers. Where's that Cerny sauce?

And no, it's not performing like a 2080S. High resolution particle effects at 4K can really tank the performance, which makes the comparison not exactly fair since PS5 uses lower settings than the lowest possible on PC.
DF stopped using the Xbox machine in the PC graphics comparison, and you think it's the PS5 that's overestimated? Really?
 

NXGamer

Member
I'm speculating the comparison is bogus if the PS5 is vsync locked or employs an aggressive adaptive vsync. A ~45 fps average reading indicates this to be the case. If vsync were off, the PS5 could, for example, be reading in the 50s, and that's with a 60 Hz cap!

NXGamer, are you able to comment and correct me here if I'm misunderstanding something....
Correct, V-Sync does have a cost. This is all dependent on the engine, state, refresh target, etc. at the time, but you will "gain" a higher rate without it, as the engine and GPU can just flip a frame as fast as they can without any sync states on the display.

This is why FreeSync, VRR, and G-Sync exist: they attempt to make the display match the GPU rather than making the GPU match the display.
 

Bo_Hazem

Banned
The Xbox Series X is also missing a lot of effects that they didn't cover in their comparison video. Smoke effects on guns and several other fire effects were missing, along with the RT shadows being broken.

I meant the XSX shadows were pretty fucked, so I edited that to avoid confusion, on top of the other problems you've mentioned here.
 

rodrigolfp

Haptic Gamepads 4 Life
Just one setting. Just like there's one setting on PS5 that's higher than PC's Ultra here.
Which one again? I can't find it anymore in the video.
The key takeaway for me is that the PS5 performs better than expected for the money. Pre-launch I believe it was said to perform like a 2070; now we're between a 3060 Ti and a 3070.

That said, this is a simple form of raytracing, so it'll be interesting to see what happens with upcoming raytracing titles, since AMD has so far struggled in this area. Things are moving quickly too, so we'll see if this gen will push console gaming back into 30fps again or if devs will choose to scale back gfx/RT instead.


The 3080 only has 10 GB right now, so it'll have to be the rumored Super/Ti in that case, which could go for $1100-1300 unless Nvidia starts dropping prices.
The key takeaway for me is that the PC performs worse than expected, just as with other CoDs.
 
I've never noticed any performance drop with v-sync. In a theoretical situation where your GPU could put out a perfect and exact 60fps without v-sync, would it be lower than 60fps with v-sync on?

Anyhow, I can see tearing in this video, so the PS5 must be using adaptive sync, which would shut off below 60fps and so would surely have no impact on performance.

It's not ideal due to the PS5 using low-quality effects that the PC can't replicate, but this is a pretty good comparison, I'd say. That said, the PS5 normally uses DRS and anyone with an Nvidia card would use DLSS, so it's not exactly a real-world comparison.
 

rodrigolfp

Haptic Gamepads 4 Life
Testing with vsync on for PS5 vs. vsync off on PC? 🤔 Has this been explained? Because that is apples and oranges.

Also, the ending punctuated with the card not being fully used because DLSS is off? Err, that's an upscaling tech, and even with it, it's below 60 in the footage. But comparing to a native 4K PS5 without upscaling?!? The guy can't help himself.
Does any past CoD drop frames with VSync? The last game where I experienced drops (and massive ones) was HZD (not surprising, a very bad port).
 

ethomaz

Banned
Isn’t it just waiting for the next screen update?

Anyway, I went and turned on vsync in Cyberpunk just now and I see no performance loss there, at least. The framerate is still fluctuating between 61 and 77 fps going by the counter. But I set the vsync to 144; there are choices for 72, 36, etc. too. Is the performance loss about where the "cap" is set? Vsync 72 would drop the highest fps to 72, cutting off 5 fps in my case at its highest. Vsync 36 would halve the framerate.
What is the refresh rate of your monitor? VSync should be set to the refresh rate of your monitor, and your framerate should be a bit above it for it to work... if your framerate is below the refresh rate you set, it will not work.

So let's say you have a 60Hz monitor... VSync will force the game to always run at 60fps... for that, your game should always be running over 60fps, with a bit of overhead to account for the performance hit of turning VSync on... so, depending on the game, you need at least 63-65fps all the time.

If you meet that, you will have VSync on and it will get rid of screen tearing.

Let me give an example... VSync works with multiples, so you can set VSync on a 60Hz monitor and have the framerate a bit over 30fps... because 60 is divisible by 30... in that case, each frame will be duplicated to reach 60 frames for the refresh rate of the monitor... that is probably why it shows you these options: 36, 72, 144... they all divide evenly into your monitor's refresh rate (144Hz).

With the example framerate you gave, only VSync 36 will work... drops to 61 won't allow VSync 72 to work... you would need a minimum of at least 75-80 fps.
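For anyone who wants to play with the idea, here is a minimal sketch of the divisor behaviour described above, assuming strict double-buffered vsync and purely hypothetical numbers (in-game caps like 36/72/144 may only expose some of these steps):

```python
import math

def effective_vsync_fps(refresh_hz: float, render_fps: float) -> float:
    """Effective output rate under strict double-buffered vsync: each frame
    can only be shown on a refresh boundary, so the rate snaps down to
    refresh_hz / n for the smallest whole number of refreshes per frame."""
    refreshes_per_frame = math.ceil(refresh_hz / render_fps)
    return refresh_hz / refreshes_per_frame

# Hypothetical render rates on a 144 Hz monitor:
for fps in (150, 80, 72, 61, 40):
    print(fps, "->", effective_vsync_fps(144, fps))
# 150 -> 144.0, 80 -> 72.0, 72 -> 72.0, 61 -> 48.0, 40 -> 36.0
# Note: driver-level strict vsync would also allow 144/3 = 48, which the
# in-game 36/72/144 options don't expose, hence needing ~75-80 fps for 72.
```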
 
I've never noticed any performance drop with v-sync. In a theoretical situation where your GPU could put out a perfect and exact 60fps without v-sync, would it be lower than 60fps with v-sync on?

Anyhow, I can see tearing in this video, so the PS5 must be using adaptive sync, which would shut off below 60fps and so would surely have no impact on performance.

It's not ideal due to the PS5 using low-quality effects that the PC can't replicate, but this is a pretty good comparison, I'd say. That said, the PS5 normally uses DRS and anyone with an Nvidia card would use DLSS, so it's not exactly a real-world comparison.
Right, but at the end of the video, 15:23, Alex shows what the 2070S does with DLSS, and it's about 53 fps avg. When the PS5 isn't bugged and runs with dynamic resolution, it's a locked 60, a minimum of 10% better than the 2070S, but who knows the actual gap because the PS5 is capped at 60 fps. Even if you add a few percent because of the alpha effects listed above, the PS5 is still outperforming the 2070S and is more in line with a 2080S. Pretty impressive stuff either way.
 

Fredrik

Member
What is the refresh rate of your monitor? VSync should be set to the refresh rate of your monitor, and your framerate should be a bit above it for it to work... if your framerate is below the refresh rate you set, it will not work.

So let's say you have a 60Hz monitor... VSync will force the game to always run at 60fps... for that, your game should always be running over 60fps, with a bit of overhead to account for the performance hit of turning VSync on... so, depending on the game, you need at least 63-65fps all the time.

If you meet that, you will have VSync on and it will get rid of screen tearing.

Let me give an example... VSync works with multiples, so you can set VSync on a 60Hz monitor and have the framerate a bit over 30fps... because 60 is divisible by 30... in that case, each frame will be duplicated to reach 60 frames for the refresh rate of the monitor... that is probably why it shows you these options: 36, 72, 144... they all divide evenly into your monitor's refresh rate (144Hz).

With the example framerate you gave, only VSync 36 will work... drops to 61 won't allow VSync 72 to work... you would need a minimum of at least 75-80 fps.
Ah okay, the way I read it, it seemed like vsync itself was demanding on the GPU, with a 5-20% performance loss. That would be brutal. But if it's essentially a cap, we know DF don't have vsync on here, since both are fluctuating at framerates that don't divide evenly into the screen refresh. If I get you right, we would then see a blocky graph going between 60 and 30, not 54 and 47, etc.
 

ethomaz

Banned
Ah okay, the way I read it, it seemed like vsync itself was demanding on the GPU, with a 5-20% performance loss. That would be brutal. But if it's essentially a cap, we know DF don't have vsync on here, since both are fluctuating at framerates that don't divide evenly into the screen refresh. If I get you right, we would then see a blocky graph going between 60 and 30, not 54 and 47, etc.
It indeed has a small or big hit on performance depending on the game/engine... some games even reach a 20% drop using VSync, but most of them are at the 5-10% level.

Of course, much stronger hardware can mitigate that performance hit.
 

DJ12

Member
When I have vsync on in Red Dead Redemption 2, I often hover around 57~60 (or at least I did when I last played it about a year ago); with it off, it was mostly above 60 by 1 or 2 frames.

Pretty annoying really: under 60 with it on, tearing with it off.
Right, but at the end of the video, 15:23, Alex shows what the 2070S does with DLSS, and it's about 53 fps avg. When the PS5 isn't bugged and runs with dynamic resolution, it's a locked 60, a minimum of 10% better than the 2070S, but who knows the actual gap because the PS5 is capped at 60 fps. Even if you add a few percent because of the alpha effects listed above, the PS5 is still outperforming the 2070S and is more in line with a 2080S. Pretty impressive stuff either way.
Not mentioned in the video is that he used his best CPU, while also mentioning that some RT features hit the CPU heavily.

A fair comparison would be for him to keep his "PS5 equivalent" CPU in there and then test the GPUs; I doubt the delta would be anywhere near as large with a slower CPU in his PC. He's using a CPU that retails for more than you can pick up a PS5 for, hardly apples to apples despite how keen he is to suggest it is.
 
Does vsync have an impact on frame rate? Or any other impact on the metrics that makes the comparison unfair?
Yes

If your framerate has a max of 60fps but the hardware could render at 80 for a little while... it would not count that.

Also, if the game drops to 45fps, it would count as 30fps if vsync is enabled.

You can't compare hardware performance if you don't measure systems in the same manner.

However pointing out different options and measuring their impact seems fair to me (just don't say that it's telling us which hardware performs "better").
 
What is the refresh rate of your monitor? VSync should be set to the refresh rate of your monitor, and your framerate should be a bit above it for it to work... if your framerate is below the refresh rate you set, it will not work.

So let's say you have a 60Hz monitor... VSync will force the game to always run at 60fps... for that, your game should always be running over 60fps, with a bit of overhead to account for the performance hit of turning VSync on... so, depending on the game, you need at least 63-65fps all the time.

If you meet that, you will have VSync on and it will get rid of screen tearing.

Let me give an example... VSync works with multiples, so you can set VSync on a 60Hz monitor and have the framerate a bit over 30fps... because 60 is divisible by 30... in that case, each frame will be duplicated to reach 60 frames for the refresh rate of the monitor... that is probably why it shows you these options: 36, 72, 144... they all divide evenly into your monitor's refresh rate (144Hz).

With the example framerate you gave, only VSync 36 will work... drops to 61 won't allow VSync 72 to work... you would need a minimum of at least 75-80 fps.
That's adaptive sync you described. CoD on consoles could be, and should be, using that, as it has no impact on performance other than tearing whenever fps drops below the screen refresh; it acts like regular vsync, with extra lag added, when at max refresh.

Regular vsync is always on regardless of whether your fps drops below the monitor refresh, causing very bad stuttering every time.
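Roughly, the difference comes down to what happens when a frame misses the refresh deadline. A toy sketch of the decision logic (my own illustration, not any engine's or API's actual behaviour):

```python
def present(frame_ready_in_time: bool, mode: str) -> str:
    """Toy model of the present decision each refresh interval.
    mode is 'vsync' (regular), 'adaptive' (adaptive vsync), or 'off'."""
    if mode == "off":
        return "present immediately (possible tearing, no wait)"
    if frame_ready_in_time:
        # Both vsync modes wait for the vblank when the frame made it in time:
        # no tearing, a little added latency.
        return "wait for vblank, then present"
    if mode == "adaptive":
        # Adaptive vsync stops syncing for this frame instead of stalling, so
        # the output rate doesn't snap down to the next divisor, at the cost
        # of a visible tear.
        return "present late without waiting (tear, no big fps drop)"
    # Regular vsync stalls until the next vblank: the frame effectively costs
    # two refresh intervals, which is the halving/stutter people notice.
    return "hold until next vblank (rate snaps down, stutter)"
```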
 
PC-to-console comparisons seem to be very, very game specific. Anyone comparing consoles to PC video cards should always add "for this game specifically" at the end. Lesson I've learned; you all should too. Be ready for consoles to hit high and low against PC. Some games are 1070 level, some are 2080; it just depends on the game.
 

PaintTinJr

Member
It is a tech that Nvidia spent time and silicon implementing; it's a core part of the architecture, and it's unfair to ignore it, especially when you potentially get better IQ as a result.

Most likely you could crank everything to Ultra and still have better performance with DLSS. Let's not ignore that the console version has dynamic 4K and, a key element, alpha effects that hit memory bandwidth hard, at a setting lower than the lowest available on PC. It's misleading to say the PS5 is in the same ballpark, which it clearly is not.
Deep Learning Super Sampling - with pre-analysis to build an upscaling AI model - isn't equivalent to dynamic resolution scaling.

AFAIK, the PS5 will continue to render the depth buffer (z-buffer) at a static full target resolution - or close enough that there is only a small amount of incoherency between the crosshair (depth-buffer picking resolution) and the rasterized dynamic resolution. Whereas DLSS will have to render the depth buffer at the lower internal PC resolution too, meaning that even if it looks almost like native resolution at full frame rate, the collision system will still be at whatever the lowest DLSS resolution in use is, so the gameplay interaction experience is still at a lower resolution - from what I make out from the Wiki info on DLSS.
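To illustrate the picking concern (purely hypothetical numbers and a made-up helper, not how this game or DLSS actually handles input): if a crosshair position at output resolution has to sample a depth buffer that only exists at the internal render resolution, the pick can drift by a fraction of an output pixel.

```python
def crosshair_pick_drift(out_w, out_h, in_w, in_h, cx, cy):
    """Illustrative only: how far an output-resolution crosshair pick drifts
    when it has to snap to a texel of a lower-resolution depth buffer."""
    # Map the output-space pixel centre into internal-buffer space...
    ix = (cx + 0.5) * in_w / out_w
    iy = (cy + 0.5) * in_h / out_h
    # ...snap to the centre of the containing internal texel...
    sx, sy = int(ix) + 0.5, int(iy) + 0.5
    # ...and map back to output space to measure the drift in output pixels.
    return abs(sx * out_w / in_w - (cx + 0.5)), abs(sy * out_h / in_h - (cy + 0.5))

# 4K output with a hypothetical 1440p internal buffer:
print(crosshair_pick_drift(3840, 2160, 2560, 1440, 1921, 1080))  # (0.75, 0.25)
```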
 

PaintTinJr

Member
V-sync in double-buffered mode stalls the rendering pipeline at the point where it flips the back buffer (the one being rendered into by the graphics pipeline) to the front buffer (the one being displayed at a refresh).

If you can't complete the frame rasterization in the allocated time slice (1/frame-rate), then you drop frames - waiting to flip - until the frame rasterization completes. Alternatively, if you complete the rendering of the back buffer early, then you are blocking the rendering pipeline until the flip/refresh, so v-sync can cost you performance unless you have really well-paced frames that are just under the time slice for rendering.

Triple buffering (with or without vsync) lets you complete one frame buffer early and start work on the next before the first one is flipped. When the flip/refresh comes, a complete frame is displayed on time and you can return to the partially completed frame without any loss of work done or tearing - but at the cost of one added frame of input latency.
....
For anyone wondering why double buffering v-sync costs performance compared to v-sync off, this is a reasonably good explanation of what is typically happening.
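To make the "well-paced frames just under the time slice" point concrete, here is a toy double-buffered vsync simulation (hypothetical frame times, assuming the GPU stalls until each flip before starting the next frame):

```python
def vsync_output_fps(frame_times_ms, refresh_hz=60.0):
    """Toy double-buffered vsync model: a finished frame is only shown on the
    next refresh tick, and rendering of the following frame starts after the
    flip. Returns (fps with vsync, fps uncapped)."""
    tick = 1000.0 / refresh_hz
    t = 0.0
    for ft in frame_times_ms:
        t += ft                        # render the frame
        t = (t // tick + 1) * tick     # stall until the next refresh to flip
    vsync_fps = 1000.0 * len(frame_times_ms) / t
    uncapped_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)
    return vsync_fps, uncapped_fps

# A GPU taking ~18 ms per frame (~55 fps uncapped) halves to 30 with vsync:
print(vsync_output_fps([18.0] * 300))   # ~ (30.0, 55.6)
# Frames just under the 16.7 ms budget keep the full 60:
print(vsync_output_fps([16.0] * 300))   # ~ (60.0, 62.5)
```

Triple buffering avoids most of that stall by letting the GPU start the next frame in a second back buffer, which is why it recovers the throughput at the cost of a frame of latency.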
 

Shmunter

Member
Yes

If your framerate has a max of 60fps but the hardware could render at 80 for a little while... it would not count that.

Also, if the game drops to 45fps, it would count as 30fps if vsync is enabled.

You can't compare hardware performance if you don't measure systems in the same manner.

However pointing out different options and measuring their impact seems fair to me (just don't say that it's telling us which hardware performs "better").
Also, if you average a series of vsync frames that fluctuate between 60 and 30, you average out at 45. AKA the PS5 readout.
 
Also, if you average a series of vsync frames that fluctuate between 60 and 30, you average out at 45. AKA the PS5 readout.
Only if it's exactly half the frames at 30 and the other half at 60... it's not going to give the same average as fluctuating from, let's say, 37 to 86.
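Worth noting that the 45 figure also depends on how you average. A tiny illustration (numbers made up; this says nothing about how DF's or the console's counters actually measure):

```python
# A 600-frame run where half the frames come out at a 30 fps pace and half at 60 fps.
frame_fps = [30] * 300 + [60] * 300

# Averaging each frame's instantaneous fps reading gives 45:
print(sum(frame_fps) / len(frame_fps))             # 45.0

# A time-weighted average (total frames / total time) gives 40, because the
# 30 fps frames occupy twice as much wall-clock time:
total_time_s = sum(1.0 / f for f in frame_fps)     # 300/30 + 300/60 = 15 s
print(len(frame_fps) / total_time_s)               # 40.0

# Spending half the *time* at each rate is what averages out to 45 either way:
print((30 * 10 + 60 * 10) / 20)                    # 45.0 frames per second over 20 s
```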
 