
IGNxgame: Uncharted: Legacy of Thieves Collection PC vs. PS5 Performance Review

Check out how each CPU core is loaded during those scenes. I wouldn't be surprised if one or two cores are 100 percent utilised and others are not utilised adequately. DF noted this issue as the explanation for the notably high loading times.

Clearly not a lot of work has gone into porting this PS5 engine to the PC. Rather poor level of effort by Iron Galaxy.
Each time the PS5 punches above its weight, the comparison is almost always declared invalid for X or Y reason. So convenient. And the fact that they checked the CPU usage during loading in order to damage-control the fact that the PS5 has quicker loading times than PC is hilarious. They really can't bear the fact that this plastic box is in any way better than their own favorite plastic box.
 

SlimySnake

Flashless at the Golden Globes
There is something seriously not right with the CPU performance on the PC version. My desktop Zen 2 (Ryzen 7 3700X) CPU falls way behind the PS5 in these CPU-heavy sections. I'm not sure if it's the hardware, the drivers, or DX12 API inefficiency, or if the PS5 hardware is simply punching above its weight.

Most settings are broadly equivalent to PS5 (i.e. High). The only exception is Model Quality, which is set to the lower "Standard" quality because, well, "Enhanced" has even more frame-rate drops and impacts the CPU.

SlimySnake yamaci17
1080p?

I had similar issues with HZD at higher framerates. For whatever reason, PS4 games don't scale linearly when it comes to higher framerates.

What are your results like in native 4k?
 

01011001

Banned
Each time the PS5 punches above its weight, the comparison is almost always declared invalid for X or Y reason. So convenient. And the fact that they checked the CPU usage during loading in order to damage-control the fact that the PS5 has quicker loading times than PC is hilarious. They really can't bear the fact that this plastic box is in any way better than their own favorite plastic box.

"oh no logic and facts are used to explain technology"

it's almost like THEY ARE CORRECT AND IT'S EASY TO PROVE THAT THEY ARE... fucking wild I know!

using 2 threads to load data and not utilising the rest of the CPU was proven and explains easily why it loads slowly.
what do you think they should do? lie? say that it loads slow because of unknown reasons? say it loads slow because obviously the PS5 is plastic box jesus and has magic powers?
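For the curious, here is a minimal C++ sketch of the difference being argued here: letting every hardware thread pull independent chunks off a shared counter, instead of pinning the loading work to two threads. decompress() and the chunk count are made-up stand-ins to illustrate the idea; this is not the game's actual loader.

```cpp
// Sketch: spread chunk decompression over all hardware threads instead of two.
// decompress() is a stand-in for real work; nothing here is the game's code.
#include <atomic>
#include <cstdio>
#include <thread>
#include <vector>

static void decompress(int chunk) {
    volatile long sink = 0;                      // stand-in for CPU-heavy work
    for (long i = 0; i < 20000000; ++i) sink += i ^ chunk;
}

int main() {
    const int chunks = 64;
    std::atomic<int> next{0};
    unsigned workers = std::thread::hardware_concurrency();
    if (workers == 0) workers = 2;               // fallback if unknown

    std::vector<std::thread> pool;
    for (unsigned w = 0; w < workers; ++w)
        pool.emplace_back([&] {                  // each worker pulls the next chunk
            for (int c; (c = next.fetch_add(1)) < chunks; )
                decompress(c);
        });
    for (auto& t : pool) t.join();

    std::printf("decompressed %d chunks on %u threads\n", chunks, workers);
}
```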
 
Each time the PS5 punches above its weight, the comparison is almost always declared invalid for X or Y reason. So convenient. And the fact that they checked the CPU usage during loading in order to damage-control the fact that the PS5 has quicker loading times than PC is hilarious. They really can't bear the fact that this plastic box is in any way better than their own favorite plastic box.
Not when the reasons are fully illustrated with facts and figures lol. Evidence-based conclusions should be easy enough to understand... and digest.
 

Md Ray

Member
1080p?

I had similar issues with HZD at higher framerates. For whatever reason, PS4 games don't scale linearly when it comes to higher framerates.

What are your results like in native 4k?
Yes, 1080p. Same results even at lower resolutions than that.

Didn't test native 4K very much, but from what I saw it was barely averaging ~46fps. Areas where the PS5 holds a consistent 40fps dip slightly under 40fps on my PC.
 

SlimySnake

Flashless at the Golden Globes
Yes, 1080p. Same results even at lower resolutions than that.

Didn't test native 4K very much, but from what I saw it was barely averaging ~46fps. Areas where the PS5 holds a consistent 40fps dip slightly under 40fps on my PC.
Alex's video had him comparing the PS5 to the fucking 2060 Super. It's not even an RT game. I wonder what's going on with his video.
 

Md Ray

Member
Alex's video had him comparing the PS5 to the fucking 2060 Super. It's not even an RT game. I wonder what's going on with his video.
He should have done the usual CPU comparison as well with his R5 3600 in the Madagascar chase sequence. Wish I had an RDNA 2 GPU to compare with my 3700X; maybe it's NVIDIA's notorious DX12 driver overhead problem rearing its ugly head here.

A desktop 8-core Zen 2, IMO, should not be performing like this.

EDIT: I've also noticed that Uncharted very rarely utilizes the full power of this GPU. By that I mean the usual power consumption when the GPU is at 100% is kinda low in UC compared to, say, A Plague Tale: Requiem, which consumes 240-244 watts at 100% load. UC tops out at 198-202 watts max at 100% load.
 

yamaci17

Member
He should have done the usual CPU comparison as well with his R5 3600 in the Madagascar chase sequence. Wish I had an RDNA 2 GPU to compare with my 3700X; maybe it's NVIDIA's notorious DX12 driver overhead problem rearing its ugly head here.

A desktop 8-core Zen 2, IMO, should not be performing like this.

EDIT: I've also noticed that Uncharted very rarely utilizes the full power of this GPU. By that I mean the usual power consumption when the GPU is at 100% is kinda low in UC compared to, say, A Plague Tale: Requiem, which consumes 240-244 watts at 100% load. UC tops out at 198-202 watts max at 100% load.
Not every game can fully load all the shader units a GPU has. It's similar to how games interact with CPUs: sometimes there's a limit to distributing the load across many cores and SMs.

I'm not surprised the 3700X is falling far behind the PS5. It actually happens in Spider-Man and many other games. As I've said before:

- Draw calls are practically free on console architectures, while they can suck up a serious amount of CPU power on desktop. This alone can create enormous performance discrepancies between platforms. It's why a CPU that is 5 times faster than the 1.6 GHz Jaguar usually only outperforms it by 2.5-3x in games on desktop (see the sketch after this post).
- The PS4/PS5 API is most likely the most CPU-efficient API out there, beating even the Xbox Series X (there are a myriad of games that run better on PS5 in CPU-bound cases). Clearly, PS5's API is much more efficient than Xbox's DX12.
- Certain other API overheads
- NV overhead

All this, and you should be grateful for the performance you have. I expect far worse results deeper into the generation.

This situation hasn't been exposed in many cases so far because you could easily be GPU-bound alongside the PS5. The PS5 mostly locks to 60 and pushes higher resolution.
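A rough way to see why a "5x faster" CPU doesn't give 5x the framerate, as the first bullet above argues: if some share of the frame's CPU time (driver, API, serial submission) doesn't scale with the faster cores, Amdahl's law caps the gain. The shares below are illustrative assumptions, not measurements.

```cpp
// Amdahl-style estimate: effective speedup when a fixed share of frame
// CPU time does not benefit from the faster cores. Shares are assumed.
#include <cstdio>

int main() {
    const double core_speedup = 5.0;            // per-core speed vs 1.6 GHz Jaguar
    for (double fixed : {0.0, 0.1, 0.2}) {      // non-scaling share of frame time
        double effective = 1.0 / (fixed + (1.0 - fixed) / core_speedup);
        std::printf("non-scaling share %3.0f%% -> %.2fx effective\n",
                    fixed * 100.0, effective);
    }
}
```

Already at a 20% non-scaling share, the 5x chip only delivers about 2.8x, which lands in the 2.5-3x range quoted above.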
 
Alex's video had him comparing the PS5 to the fucking 2060 Super. It's not even an RT game. I wonder what's going on with his video.
As usual with most PS5 vs PC comparisons, they use any means they can to twist the comparison in order to make the PS5 look bad by:

- Using a cherry-picked short scene or frame instead of a larger scene average.
- Using a high-end 5GHz CPU + 2060 Super in CPU-limited scenes and claiming they are comparing against the PS5 GPU only (and not the CPU + GPU + API combination, as for instance NXGamer rightfully says).
- When the PS5 still beats, say, a 3070, they simply refuse to compare them by claiming X or Y reason (which doesn't prevent them from making comparisons in plenty of other analyses).
- And finally, when a comparison is possible, they refuse to compare PC against PS5 uncapped when the PS5 has an uncapped VRR mode (like in Spider-Man).

This has been the case in almost all of Alex's PC vs PS5 comparisons to date, notably Death Stranding, God of War and the Uncharted remasters.
 

winjer

Gold Member
- Draw calls are practically free on console architectures, while they can suck up a serious amount of CPU power on desktop. This alone can create enormous performance discrepancies between platforms. It's why a CPU that is 5 times faster than the 1.6 GHz Jaguar usually only outperforms it by 2.5-3x in games on desktop.

Draw calls only become a big issue on PC with badly optimized games.
The PC has low-level APIs that can process a lot of draw calls, so they don't become a bottleneck.
But even on DX11, draw calls are not as big an issue. If a developer uses command lists, they can get a lot of draw calls processed.
 

winjer

Gold Member
As usual with most PS5 vs PC comparisons, they use any means they can to twist the comparison in order to make the PS5 look bad by:

- Using a cherry-picked short scene or frame instead of a larger scene average.
- Using a high-end 5GHz CPU + 2060 Super in CPU-limited scenes and claiming they are comparing against the PS5 GPU only (and not the CPU + GPU + API combination, as for instance NXGamer rightfully says).
- When the PS5 still beats, say, a 3070, they simply refuse to compare them by claiming X or Y reason (which doesn't prevent them from making comparisons in plenty of other analyses).
- And finally, when a comparison is possible, they refuse to compare PC against PS5 uncapped when the PS5 has an uncapped VRR mode (like in Spider-Man).

This has been the case in almost all of Alex's PC vs PS5 comparisons to date, notably Death Stranding, God of War and the Uncharted remasters.

Neither DF nor NXGamer are good ways to compare PS5 to PC.
DF used a 12900K, which is a vastly superior CPU to what is inside the PS5. And NXGamer used a broken PC that vastly underperforms.

You might complain that people refuse to accept NXGamer's results, but the fact remains that several users with similar PCs have compared his performance and noticed a huge discrepancy in results.
This is not margin-of-error stuff. It's a huge gap that completely invalidates his PC results.
The assessment that a 3070 is equal to the PS5 is complete nonsense, stemming from the simple fact that you are using results from a very bad review.
 

yamaci17

Member
Draw calls only become a big issue on PC with badly optimized games.
The PC has low-level APIs that can process a lot of draw calls, so they don't become a bottleneck.
But even on DX11, draw calls are not as big an issue. If a developer uses command lists, they can get a lot of draw calls processed.
Low-level APIs only removed the cap on the number of draw calls. Draw calls themselves still have a significant CPU cost (see the toy example after this post).


"The CPU of the PlayStation® 4 is an AMD Jaguar with 8 cores. It is obviously slower than some recently-released PC hardware; but the PlayStation® 4 has some major advantages, such as very fast access to the hardware. We find the PlayStation® 4 graphics API to be much more efficient than all PC APIs. It is very direct and has very low overhead. This means we can push a lot of draw calls per frame. We knew that the high number of draw calls could be an issue with low-end PCs."
 

winjer

Gold Member
Low-level APIs only removed the cap on the number of draw calls. Draw calls themselves still have a significant CPU cost.


"The CPU of the PlayStation® 4 is an AMD Jaguar with 8 cores. It is obviously slower than some recently-released PC hardware; but the PlayStation® 4 has some major advantages, such as very fast access to the hardware. We find the PlayStation® 4 graphics API to be much more efficient than all PC APIs. It is very direct and has very low overhead. This means we can push a lot of draw calls per frame. We knew that the high number of draw calls could be an issue with low-end PCs."

Yes, console APIs go to a lower level than on PC and have lower overhead.
But the issue of draw calls on PC has been massively overblown.
A 3070 will not perform like a PS5 just because of API overhead. Only if it's a very badly optimized game. Or a broken PC.

 

winjer

Gold Member
I'm talking about the CPU performance, not the GPU

I understand what you mean, that APIs on consoles have a lower overhead. But even this overhead has been overstated.
The biggest issue with PC performance in some games is lack of optimization.
But it's not like the CPUs on consoles don't have issues of their own. Less cache and very high memory latency are some of the most pressing (a quick way to see the latter is sketched below).
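For anyone who wants to put a number on memory latency: a pointer chase where every load depends on the previous one, over a buffer far bigger than L3. This is a rough sketch, not a calibrated benchmark, and the buffer size and step count are assumptions picked for illustration.

```cpp
// Dependent-load pointer chase: approximates memory latency once the
// working set (~128 MB here) is far beyond any CPU cache.
#include <chrono>
#include <cstdio>
#include <numeric>
#include <random>
#include <vector>

int main() {
    const size_t n = size_t(1) << 24;            // 16M entries, ~128 MB
    std::vector<size_t> next(n);
    std::iota(next.begin(), next.end(), size_t{0});

    std::mt19937_64 rng{42};                     // Sattolo shuffle: one big cycle
    for (size_t k = n - 1; k > 0; --k) {
        std::uniform_int_distribution<size_t> d(0, k - 1);
        std::swap(next[k], next[d(rng)]);
    }

    const size_t steps = 10000000;
    size_t i = 0;
    auto t0 = std::chrono::steady_clock::now();
    for (size_t s = 0; s < steps; ++s) i = next[i];  // each load depends on the last
    auto t1 = std::chrono::steady_clock::now();

    double ns = std::chrono::duration<double, std::nano>(t1 - t0).count() / steps;
    std::printf("~%.1f ns per dependent load (checksum %zu)\n", ns, i);
}
```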
 
That's still leaving things up for guesswork, which defeats the purpose of a benchmark. And this is where NXG is lacking, because his 2700X system seems to be benching consistently lower than similarly equipped systems. That should be his starting point. His haphazard testing methodology is why most people take his videos with a grain of salt.

Absolutely like-for-like testing environment and consistency is key for benchmarking. Did you check out the watch dogs legion DF comparison I mentioned earlier? What are your thoughts on it?
My point is, if the PS5 still outperforms the card even with that lower setting, then the PS5 is faster by whatever that percentage is at minimum, and could theoretically be further ahead with a perfectly matched setting.
 
That's still leaving things up for guesswork, which defeats the purpose of a benchmark. And this is where NXG is lacking, because his 2700X system seems to be benching consistently lower than similarly equipped systems. That should be his starting point. His haphazard testing methodology is why most people take his videos with a grain of salt.

Absolutely like-for-like testing environment and consistency is key for benchmarking. Did you check out the watch dogs legion DF comparison I mentioned earlier? What are your thoughts on it?
Their Watch Dogs Legion comparison would be fine; the problem is that since the framerates aren't uncapped, we don't know the full story.
There is something seriously not right with the CPU performance on the PC version. My desktop Zen 2 (Ryzen 7 3700X) CPU falls way behind the PS5 in these CPU-heavy sections. I'm not sure if it's the hardware, the drivers, or DX12 API inefficiency, or if the PS5 hardware is simply punching above its weight.

Most settings are broadly equivalent to PS5 (i.e. High). The only exception is Model Quality, which is set to the lower "Standard" quality because, well, "Enhanced" has even more frame-rate drops and impacts the CPU.

[Eight pairs of screenshots comparing the same scenes on PC and PS5]

SlimySnake yamaci17
Thank you for actually matching the CPU as closely as possible for your bench, and yeah, there is something wrong there.
 

rofif

Can’t Git Gud
There is something seriously not right with the CPU performance on the PC version. My desktop Zen 2 (Ryzen 7 3700X) CPU falls way behind the PS5 in these CPU-heavy sections. I'm not sure if it's the hardware, the drivers, or DX12 API inefficiency, or if the PS5 hardware is simply punching above its weight.

Most settings are broadly equivalent to PS5 (i.e. High). The only exception is Model Quality, which is set to the lower "Standard" quality because, well, "Enhanced" has even more frame-rate drops and impacts the CPU.

[Eight pairs of screenshots comparing the same scenes on PC and PS5]

SlimySnake yamaci17
Interesting.
The PS5 should have a similar CPU to the 3700X. That's why I got that CPU, to have parity for the upcoming gen (the 3700X came out before the PS5).
If you lower the resolution, are you still CPU limited?
 

SlimySnake

Flashless at the Golden Globes
Found a 6600 XT benchmark. 1440p only, but it hovers around 60 fps.



Do we know the average fps in Lost Legacy at 1440p? EDIT: Just ran through the same level on my LG CX and the framerate was 90-100 fps, with 90 being the average and some rare drops to 85 fps.

EDIT #2: Didn't realize he paired it with the 5900X. That's a 12-core, 24-thread CPU hitting 4.8 GHz. And yet the PS5 is outperforming it by 50%. Crazy.
 

SlimySnake

Flashless at the Golden Globes
Interesting.
The PS5 should have a similar CPU to the 3700X. That's why I got that CPU, to have parity for the upcoming gen (the 3700X came out before the PS5).
If you lower the resolution, are you still CPU limited?
The 3700X goes up to 4.4 GHz and has 4x more cache than the PS5 CPU. I'd say the PS5 CPU is roughly around the 2700X, if not a bit worse.

The PS5 is also reserving 2 cores for the stupid OS. Should've been one core / two threads max.
 
The 3700X goes up to 4.4 GHz and has 4x more cache than the PS5 CPU. I'd say the PS5 CPU is roughly around the 2700X, if not a bit worse.

The PS5 is also reserving 2 cores for the stupid OS. Should've been one core / two threads max.
That just makes the 3700X look worse, then. I don't think this is the point you think it is.
 
Neither DF nor NXGamer are good ways to compare PS5 to PC.
DF used a 12900K, which is a vastly superior CPU to what is inside the PS5. And NXGamer used a broken PC that vastly underperforms.

You might complain that people refuse to accept NXGamer's results, but the fact remains that several users with similar PCs have compared his performance and noticed a huge discrepancy in results.
This is not margin-of-error stuff. It's a huge gap that completely invalidates his PC results.
The assessment that a 3070 is equal to the PS5 is complete nonsense, stemming from the simple fact that you are using results from a very bad review.
I don't think anything is wrong with his PC; he just uses a CPU that's somewhat worse than the PS5's.
 

SlimySnake

Flashless at the Golden Globes
That just makes the 3700X look worse, then. I don't think this is the point you think it is.
In this particular game, we don't know what's holding back the 3700X. But we know from the benchmarks of repurposed PS5 chips that the PS5 CPU (4700S) performs worse than the 2700 NX Gamer uses.

[4700S desktop kit benchmark charts]



Go to the thread where he reviews the City Demo UE5.
There you will see the performance on his PC is off, by a huge margin.
Yeah, I used to think that the 2700X was the issue, but after the UE5 thread, I'm not so sure. I wouldn't be surprised if he has XMP disabled lol.
 

01011001

Banned
Each time the PS5 punches above its weight, the comparison is almost always declared invalid for X or Y reason. So convenient. And the fact that they checked the CPU usage during loading in order to damage-control the fact that the PS5 has quicker loading times than PC is hilarious. They really can't bear the fact that this plastic box is in any way better than their own favorite plastic box.

As usual with most PS5 vs PC comparisons, they use any means they can to twist the comparison in order to make the PS5 look bad

you are the whole clown show, aren't you?

[clowns GIF]
 

SlimySnake

Flashless at the Golden Globes
Interesting. A 2080 Super comparison has the PS5 ahead in similar scenarios. This guy is using an i7-9700K, which runs at 4.9 GHz but has only 8 cores and 8 threads, and my PS5 is roughly 15-20 fps ahead. Figured the CPU was bottlenecking the 2080 Super, but then I saw a 3070 Ti similarly bottlenecked, and the PS5 is roughly the same performance at 1440p High settings despite it being paired with a 5900X.

Pretty impressive showing for the PS5 here. Either that, or this game is really poorly optimized for PC lol.




NX Gamer's results, at least in this game, might not be that far off if those 2080 Super and 3070 Ti benchmarks are an indication. I found a 3080 1440p benchmark but need to run through that level on my PS5 to see how it fares. I am unable to find any 2070 Super benchmarks on YouTube that aren't using DLSS.
 

ACESHIGH

Banned
The game is a shit-tier port; there's no way around it. That's the main issue here. But hacks like Digital Foundry see a graphics options menu with a lot of tabs and declare in 10 seconds that the port is great. Same with most positive Steam reviews that are basically "now bring Bloodborne to PC" with 3 hours of gameplay.

Naughty Dog themselves have to step in and fix this so that the PC conversion of the engine is in a good state by the time Part 1 and Factions are ported. If they want to get into the PC F2P space, Factions has to be scalable and optimized to a T, pumping out high FPS on mainstream configs.
 

squarealex

Member
There is something seriously not right with the CPU performance on the PC version. My desktop Zen 2 (Ryzen 7 3700X) CPU falls way behind the PS5 in these CPU-heavy sections. I'm not sure if it's the hardware, the drivers, or DX12 API inefficiency, or if the PS5 hardware is simply punching above its weight.

Most settings are broadly equivalent to PS5 (i.e. High). The only exception is Model Quality, which is set to the lower "Standard" quality because, well, "Enhanced" has even more frame-rate drops and impacts the CPU.

[Eight pairs of screenshots comparing the same scenes on PC and PS5]

SlimySnake yamaci17
The problem is, and always has been, the DX12 API... you lose some perf compared to DX11...
 
Interesting. A 2080 Super comparison has the PS5 ahead in similar scenarios. This guy is using an i7-9700K, which runs at 4.9 GHz but has only 8 cores and 8 threads, and my PS5 is roughly 15-20 fps ahead. Figured the CPU was bottlenecking the 2080 Super, but then I saw a 3070 Ti similarly bottlenecked, and the PS5 is roughly the same performance at 1440p High settings despite it being paired with a 5900X.

Pretty impressive showing for the PS5 here. Either that, or this game is really poorly optimized for PC lol.




NX Gamer's results, at least in this game, might not be that far off if those 2080 Super and 3070 Ti benchmarks are an indication. I found a 3080 1440p benchmark but need to run through that level on my PS5 to see how it fares. I am unable to find any 2070 Super benchmarks on YouTube that aren't using DLSS.

This isn't surprising; the PS5 at points was performing like a 3070 Ti in Spider-Man.
 

01011001

Banned
This isn't surprising; the PS5 at points was performing like a 3070 Ti in Spider-Man.

Do we have updated tests on this? Because the port was clearly not using VRAM correctly at launch and ran like shit on any 8GB card. They apparently patched this now.
 
Do we have updated tests on this? Because the port was clearly not using VRAM correctly at launch and ran like shit on any 8GB card. They apparently patched this now.
I think the VRAM fix helped with the stutters that occurred at points, not general performance, at least not by much.
 

SlimySnake

Flashless at the Golden Globes
Did a couple more comparisons. Man, the 3060 Ti (2080 equivalent) does not have a good showing in Lost Legacy. It struggles to stay at 60 fps, while the same level on my PS5 was consistently around 100-110 fps, even hitting 118 fps at one point during the checkpoint cutscene. Never seen performance this bad on an Nvidia card.

The 13-TFLOPS 6700 XT below fares much better, roughly 10-15 fps behind in this level. Though the level is not a good comparison, because the background is an empty ocean which pretty much acts like the sky, where the framerate jumps up to 120 fps at 1440p. Still, the PS5 was around 95-105 when swinging around in this level.




This game clearly favors AMD GPUs, just like we saw with Death Stranding, and it's definitely not well optimized on PC. Regardless, this just goes to show that the PS5 I/O and coding to the metal are heavily utilized by Sony studios, and why exclusives matter. We've seen third-party games have the PS5 perform more or less the same as a 2080 or a 6600 XT in most cases. We're seeing A Plague Tale right now performing like dog shit on the PS5, and yet this game, GOW, Death Stranding and Spider-Man just outperform those cards with little to no effort.
 

Whitecrow

Banned
Did a couple more comparisons. Man, the 3060 Ti (2080 equivalent) does not have a good showing in Lost Legacy. It struggles to stay at 60 fps, while the same level on my PS5 was consistently around 100-110 fps, even hitting 118 fps at one point during the checkpoint cutscene. Never seen performance this bad on an Nvidia card.

The 13-TFLOPS 6700 XT below fares much better, roughly 10-15 fps behind in this level. Though the level is not a good comparison, because the background is an empty ocean which pretty much acts like the sky, where the framerate jumps up to 120 fps at 1440p. Still, the PS5 was around 95-105 when swinging around in this level.




This game clearly favors AMD GPUs, just like we saw with Death Stranding, and it's definitely not well optimized on PC. Regardless, this just goes to show that the PS5 I/O and coding to the metal are heavily utilized by Sony studios, and why exclusives matter. We've seen third-party games have the PS5 perform more or less the same as a 2080 or a 6600 XT in most cases. We're seeing A Plague Tale right now performing like dog shit on the PS5, and yet this game, GOW, Death Stranding and Spider-Man just outperform those cards with little to no effort.

TBH, A Plague Tale: Requiem's tech trumps all cross-gen PS titles.
 

THE DUCK

voted poster of the decade by bots
One of the best games ever gets a PC release!
Nice analysis @NXGamer


Seems like a nice, straightforward update.
Ultra settings are very close to PS5, aside from shadows, which are a bit sharper on PC.
No motion blur on PC. It's broken and the option does nothing.
Some small texture bugs and LOD bugs (shorter LOD distances than PS5).
Shader compilation on the CPU, like in Spider-Man.

An RTX 2070 (theoretically similar to the PS5 GPU) might drop to 20fps at 4K Ultra due to memory limitations.
Up to about 40 on High. Generally, at 4K the PS5 is 40% faster than the RTX 2070 at the same settings.

Looks like a good port to me! Shame DualSense features don't work wirelessly.
I have not found any info on 3D audio.
Edit: Loading is 2-3 seconds on PS5, 7-8 seconds on PC.
Edit 2: Maybe no HDR... oof. OK, looks like a bit of a shitty port.


Wait, it never occurred to me that the DualSense doesn't work wirelessly. If Sony is going to support PC, why don't they release a wireless adaptor?
Hopefully it's in the works for all of you playing Sony games on PC.
 
Interesting. A 2080 Super comparison has the PS5 ahead in similar scenarios. This guy is using an i7-9700K, which runs at 4.9 GHz but has only 8 cores and 8 threads, and my PS5 is roughly 15-20 fps ahead. Figured the CPU was bottlenecking the 2080 Super, but then I saw a 3070 Ti similarly bottlenecked, and the PS5 is roughly the same performance at 1440p High settings despite it being paired with a 5900X.

Pretty impressive showing for the PS5 here. Either that, or this game is really poorly optimized for PC lol.




NX Gamer's results, at least in this game, might not be that far off if those 2080 Super and 3070 Ti benchmarks are an indication. I found a 3080 1440p benchmark but need to run through that level on my PS5 to see how it fares. I am unable to find any 2070 Super benchmarks on YouTube that aren't using DLSS.

No surprises here. From the start, NX Gamer's results have been the most realistic when compared to those of normal PC players. A 2080 Super, 3070, or 3070 Ti in some scenes is the level of performance of the PS5 in many games, ever since one of the first PS5 versions of COD (when the game is correctly optimized on that machine, obviously, by a competent developer that already has experience coding on PlayStation).

So no surprises here. By the way, the difference in settings between PS5 and PC is incredibly small in this game; you need 400% zoom to see the differences (and that should not make more than a few % of performance difference either way).
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Wait, it never occurred to me that the DualSense doesn't work wirelessly. If Sony is going to support PC, why don't they release a wireless adaptor?
Hopefully it's in the works for all of you playing Sony games on PC.
It works wirelessly, it just doesn't send correct haptics and controller audio without some hoop-jumping. (Just install DualSenseX and it works near perfectly.)
Gyro and touchpad work with SteamInput.
Haptics, ironically, work with Switch games.
Haptics + triggers require disabling SteamInput and only work in PC-native games while wired.
(At least in every game I've tested.)

I'm guessing it's partly due to the DualSense SDK still needing some work.
Since you can get practically the full feature set wirelessly using DualSenseX, and games don't really use gyro, you aren't really missing out.


<---- plays like 1 foot from my monitor, so wired or not doesn't actually matter, but if I were couching it, it would be nice to get the full feature set natively without needing to run DualSenseX.
 

winjer

Gold Member
In this particular game, we don't know what's holding back the 3700X. But we know from the benchmarks of repurposed PS5 chips that the PS5 CPU (4700S) performs worse than the 2700 NX Gamer uses.

Yeah, I used to think that the 2700X was the issue, but after the UE5 thread, I'm not so sure. I wouldn't be surprised if he has XMP disabled lol.

XMP can boost performance sometimes, depending on the kit.
Weirdly, some kits have higher timings in XMP mode than in standard mode.

But even with kits that have decent XMP settings, it wouldn't make that big a difference, as we saw in the UE5 thread.
And it wouldn't affect the memory usage. There is something very wrong with his PC that goes beyond XMP.
 

rofif

Can’t Git Gud
Wait, it never occurred to me that the DualSense doesn't work wirelessly. If Sony is going to support PC, why don't they release a wireless adaptor?
Hopefully it's in the works for all of you playing Sony games on PC.
It works wirelessly, but the haptics and trigger functions do not work wirelessly.
 

ACESHIGH

Banned
Have they even stated that they are working on patching this game? What a shit show.

Hopefully the PC version of The Last of Us is handled by a more competent developer. Let's see how Sackboy turns out this week.
 

Gaiff

Gold Member
No surprises here. From the start, NX Gamer's results have been the most realistic when compared to those of normal PC players. A 2080 Super, 3070, or 3070 Ti in some scenes is the level of performance of the PS5 in many games, ever since one of the first PS5 versions of COD (when the game is correctly optimized on that machine, obviously, by a competent developer that already has experience coding on PlayStation).

So no surprises here. By the way, the difference in settings between PS5 and PC is incredibly small in this game; you need 400% zoom to see the differences (and that should not make more than a few % of performance difference either way).
That first PS5 version of COD has alpha effects on consoles reduced to 1/4 resolution compared to the PC version, which is much easier on the ROPs and overall bandwidth. Furthermore, the latest GeForce driver also provided some strong performance improvements in some games.

In COD Warzone, for instance, fps improved by as much as 44% in some cases.



This isn't surprising; the PS5 at points was performing like a 3070 Ti in Spider-Man.
The 2080 Ti also outperforms the 3070 Ti and 3070 in some instances in Spider-Man. Some guys on Beyond3D also posted screenshots where the 2080 Ti outperformed the PS5 by 40%+ in some scenes.

There was something strange going on with the VRAM, and I'm guessing the BVH structure being extremely heavy on the CPU, as stated by Nixxes, has something to do with the performance inconsistencies.

PCs and the PS5 have different configurations, so different bottlenecks will occur. The PS5 seems to be doing excellently during rapid streaming but is ostensibly still bandwidth-constrained, with even in-house devs still opting for lower AF when 16x AF has been practically free on PC for years. Those 1:1 comparisons are often flawed because, as I said, different scenes will hit different areas differently (whew, that's a lot of "different"). It's far more complicated than just PS5 GPU > 2070 > 2080S, etc. The PS5 is the sum of all its parts, not just a GPU that can be isolated.
 

ScHlAuChi

Member
Naughty Dog themselves have to step in and fix this so that the PC conversion of the engine is in a good state by the time Part 1 and Factions are ported. If they want to get into the PC F2P space, Factions has to be scalable and optimized to a T, pumping out high FPS on mainstream configs.
You clearly have no idea at all how game development works!
Naughty Dog's programmers are focused on whatever their current codebase is.
Going back to an old codebase that no one has touched for years means you would have to re-learn everything about it.
No developer is ever going to do that when they have bigger fish to fry - hence why ports are outsourced!
 

SlimySnake

Flashless at the Golden Globes
XMP can boost performance sometimes, depending on the kit.
Weirdly, some kits have higher timings in XMP mode than in standard mode.

But even with kits that have decent XMP settings, it wouldn't make that big a difference, as we saw in the UE5 thread.
And it wouldn't affect the memory usage. There is something very wrong with his PC that goes beyond XMP.
Yeah, but his 2070 Super results match other 2070 Super benchmarks I've seen on YouTube.

Look at the 3070 here, performing worse than the PS5 at 1440p. The Ultra settings might be the reason why, but Alex said the PS5 has better LOD overall in the DF Direct yesterday.



The game just runs poorly on Nvidia hardware. The 3070 is 26% more powerful than the 2080, which is roughly on par with the PS5 and XSX in some games. It should not be performing worse than the PS5 here. I'm guessing ND made heavy use of async compute on the PS4, and that's why it performs better on AMD hardware. The PS5 is still outperforming a 13-TFLOPS 6700 XT, so that's probably where the poor optimization part comes in, but NX Gamer's results might not be inaccurate in this instance.
 

winjer

Gold Member
Yeah, but his 2070 Super results match other 2070 Super benchmarks I've seen on YouTube.

Look at the 3070 here, performing worse than the PS5 at 1440p. The Ultra settings might be the reason why, but Alex said the PS5 has better LOD overall in the DF Direct yesterday.



The game just runs poorly on Nvidia hardware. The 3070 is 26% more powerful than the 2080, which is roughly on par with the PS5 and XSX in some games. It should not be performing worse than the PS5 here. I'm guessing ND made heavy use of async compute on the PS4, and that's why it performs better on AMD hardware. The PS5 is still outperforming a 13-TFLOPS 6700 XT, so that's probably where the poor optimization part comes in, but NX Gamer's results might not be inaccurate in this instance.


No doubt that the Uncharted 4 port is a bad one, making the comparison more pointless.
But previous tests showed NXGamer's PC results were very flawed.
So unless he has fixed his PC, his current results continue to be invalid.

Although previous generations of Nvidia architectures had very lacking support for async compute, Turing and newer GPUs have very good async support. So that is not it.
Could be drivers, or some other matter of architectural differences between Nvidia and AMD.
 

SlimySnake

Flashless at the Golden Globes
Have they even stated that they are working on patching this game? What a shit show.

Hopefully the PC version of The Last of Us is handled by a more competent developer. Let's see how Sackboy turns out this week.
The game was developed on console. It's a port. It was never going to perform like third-party PC games that are developed on PC first and then downported to consoles.

We saw this with the GOW port. The porting studio took 2 years to port the game, and they admitted that completely changing the CPU logic would've taken even more time. We might be seeing something similar here.

TBH, I don't think it's a bad thing. If I were a PC tech reviewer, I wouldn't call these great ports, but expecting these console games to perform like PC games is a bit unfair. They were coded around the PS4 hardware, and one can't be expected to rearchitect the entire engine for a simple port. After all, we have seen plenty of games underperform on PS5. Hitman, for example. A Plague Tale recently. They were downported to the PS5 and it shows. Does that mean they are shitty devs? Not really. It's just the nature of video game development. Exclusives are important for this reason: devs use one SKU and focus on extracting every last bit of performance from that one system, whereas multiplats are just meant to run smoothly without many bugs.

Compared to HZD, which was a buggy mess, this game, Spider-Man and GOW are perfectly playable. They just have bad performance on older GPUs from 2016-2018, which is perfectly acceptable in PC gaming. Minimum specs increase every year. Death Stranding was simultaneously developed on PC, and it runs fine on both the PS5 and PC hardware, scaling like it should, but only because it was announced from day one as a PC game. Sony studios make their games on PS4 devkits.

Games like this should be celebrated, tbh. It means the dev went the extra mile and coded to the metal instead of just developing on PC. We all wish Sony devs would use the PS5 I/O, and well, games like this are a perfect example of just how powerful these consoles can be when not held back by PC multiplats.
 
The game was developed on console. It's a port. It was never going to perform like third-party PC games that are developed on PC first and then downported to consoles.

We saw this with the GOW port. The porting studio took 2 years to port the game, and they admitted that completely changing the CPU logic would've taken even more time. We might be seeing something similar here.

TBH, I don't think it's a bad thing. If I were a PC tech reviewer, I wouldn't call these great ports, but expecting these console games to perform like PC games is a bit unfair. They were coded around the PS4 hardware, and one can't be expected to rearchitect the entire engine for a simple port. After all, we have seen plenty of games underperform on PS5. Hitman, for example. A Plague Tale recently. They were downported to the PS5 and it shows. Does that mean they are shitty devs? Not really. It's just the nature of video game development. Exclusives are important for this reason: devs use one SKU and focus on extracting every last bit of performance from that one system, whereas multiplats are just meant to run smoothly without many bugs.

Compared to HZD, which was a buggy mess, this game, Spider-Man and GOW are perfectly playable. They just have bad performance on older GPUs from 2016-2018, which is perfectly acceptable in PC gaming. Minimum specs increase every year. Death Stranding was simultaneously developed on PC, and it runs fine on both the PS5 and PC hardware, scaling like it should, but only because it was announced from day one as a PC game. Sony studios make their games on PS4 devkits.

Games like this should be celebrated, tbh. It means the dev went the extra mile and coded to the metal instead of just developing on PC. We all wish Sony devs would use the PS5 I/O, and well, games like this are a perfect example of just how powerful these consoles can be when not held back by PC multiplats.
Death Stranding on PS5 performs on par with a 3060, even a 3070 in some scenes. Shouldn't people complain that it was a bad PC version because the game runs so well comparatively on PS5? So the outcome is about the same as with Uncharted, actually.
 

Md Ray

Member
Check out how each CPU core is loaded during those scenes. I wouldn't be surprised if one or two cores are 100 percent utilised and others are not utilised adequately. DF noted this issue as the explanation for the notably high loading times.

Clearly not a lot of work has gone into porting this PS5 engine to the PC. Rather poor level of effort by Iron Galaxy.
Good point, and I just checked as you asked... 1-2 cores are 100% loaded only during game load, but during gameplay, all 16 threads are almost evenly loaded in a CPU-intensive scene like this.

Check it out:
FPS: 86
[PC per-thread CPU utilization screenshot]

Let's see the same scene on PS5:
FPS: 117
[PS5 screenshot of the same scene]
Thank you for actually matching the CPU as closely as possible for your bench, and yeah, there is something wrong there.
You're welcome! Yeah, check out the above screenshots.
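For anyone who wants to log this kind of per-core loading themselves without screenshots, here is a small sampler sketch. It reads Linux's /proc/stat; the posters here are on Windows with overlay tools, so treat this as an illustration of the idea rather than their actual setup.

```cpp
// Per-core CPU utilisation over a 1-second window, from Linux /proc/stat.
#include <cctype>
#include <chrono>
#include <cstdio>
#include <fstream>
#include <sstream>
#include <string>
#include <thread>
#include <vector>

struct CoreTimes { unsigned long long busy = 0, total = 0; };

static std::vector<CoreTimes> sample() {
    std::vector<CoreTimes> cores;
    std::ifstream f("/proc/stat");
    std::string line;
    while (std::getline(f, line)) {
        // only per-core lines ("cpu0 ...", "cpu1 ..."), not the aggregate "cpu"
        if (line.rfind("cpu", 0) != 0 || line.size() < 4 ||
            !std::isdigit(static_cast<unsigned char>(line[3])))
            continue;
        std::istringstream ss(line);
        std::string label;
        unsigned long long v, idle = 0, total = 0;
        ss >> label;
        for (int i = 0; ss >> v; ++i) { total += v; if (i == 3) idle = v; }
        cores.push_back({total - idle, total});
    }
    return cores;
}

int main() {
    auto a = sample();
    std::this_thread::sleep_for(std::chrono::seconds(1));
    auto b = sample();
    for (size_t i = 0; i < a.size() && i < b.size(); ++i) {
        double dt = double(b[i].total - a[i].total);
        double pct = dt > 0 ? 100.0 * double(b[i].busy - a[i].busy) / dt : 0.0;
        std::printf("core %2zu: %5.1f%%\n", i, pct);
    }
}
```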
Not every game can fully load all the shader units a GPU has. It's similar to how games interact with CPUs: sometimes there's a limit to distributing the load across many cores and SMs.

I'm not surprised the 3700X is falling far behind the PS5. It actually happens in Spider-Man and many other games. As I've said before:

- Draw calls are practically free on console architectures, while they can suck up a serious amount of CPU power on desktop. This alone can create enormous performance discrepancies between platforms. It's why a CPU that is 5 times faster than the 1.6 GHz Jaguar usually only outperforms it by 2.5-3x in games on desktop.
- The PS4/PS5 API is most likely the most CPU-efficient API out there, beating even the Xbox Series X (there are a myriad of games that run better on PS5 in CPU-bound cases). Clearly, PS5's API is much more efficient than Xbox's DX12.
- Certain other API overheads
- NV overhead

All this, and you should be grateful for the performance you have. I expect far worse results deeper into the generation.

This situation hasn't been exposed in many cases so far because you could easily be GPU-bound alongside the PS5. The PS5 mostly locks to 60 and pushes higher resolution.
Agreed.
Interesting.
The PS5 should have a similar CPU to the 3700X. That's why I got that CPU, to have parity for the upcoming gen (the 3700X came out before the PS5).
If you lower the resolution, are you still CPU limited?
Same, I went with the 3700X for the same reason.
Even at 720p, I get the same 80-90fps as I do at 1080p. The GPU is heavily underutilized.
 