Naru
> You are wrong. Motion blur MUST be included in games. It just must, to make it look more realistic.

I have nothing against motion blur personally; I am actually pretty OK with it, but my body disagrees with me.
> No doubt that the Uncharted 4 port is a bad one. Making the comparison more pointless.

By that logic, all comparisons are pointless. How many has Alex done? AC Valhalla, Hitman 3, Call of Duty, Deathloop. He's constantly comparing PS5 ports of third-party games that may or may not have been optimized to take full advantage of the PS5 hardware, but he has no problem doing those comparisons. Notice how he doesn't do an FPS comparison in this video? Why? Just because it doesn't fit his narrative that a PS5 is a 2060? No one is going to look at the Spider-Man and Uncharted benchmarks and say the PS5 is equivalent to a 3070, but Alex had no problem claiming the 2060 is equivalent to a PS5 literally days after launch.
SlimySnake said:
> I am all for devs taking advantage of console hardware.
> But when porting something to PC they should do a good job. After all, we are still paying full price for these games.
> If a console release were as broken as these ports, they would fix it on the double.

I guess I am a bit behind on the issues. I know it's not performing as well as it should, but I wasn't aware of any bugs beyond the motion blur issue Alex pointed out in this video. What else is broken?
One thing I'll give the studios working on these ports is that their games eventually get patched and left in a somewhat decent state. They had lots of back-and-forth communication on the Steam forums regarding issues, to manage customers' expectations.
Let's hope Sony straightens Iron Galaxy out so the game gets fixed. It's one of their flagship IPs, after all, and it's selling like dog shit. Sure as hell the port quality has something to do with it.
> Good point, and I just checked as you asked... 1-2 cores are 100% loaded only during game load, but during gameplay all 16 threads are almost evenly loaded in a CPU-intensive scene like this.
> Check it out:
> FPS: 86
> Let's see the same scene on PS5:
> FPS: 117

Is this on 1440p or 1080p mode?
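For what it's worth, the gap between those two numbers is easy to put in percentage terms. A quick sketch; the 86 and 117 fps figures are from the post above, the function name is mine:

```python
# Quick arithmetic on the figures above: 86 fps (3070 PC) vs 117 fps (PS5)
# in the same CPU-intensive scene.

def relative_gap_pct(baseline_fps: float, other_fps: float) -> float:
    """How much faster `other_fps` is than `baseline_fps`, in percent."""
    return (other_fps / baseline_fps - 1.0) * 100.0

pc_fps, ps5_fps = 86.0, 117.0
print(f"PS5 leads by {relative_gap_pct(pc_fps, ps5_fps):.0f}%")  # PS5 leads by 36%
```

So in that scene the PS5 is roughly a third faster, which is a big gap for a "CPU-limited" comparison.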
> No doubt that the Uncharted 4 port is a bad one. Making the comparison more pointless.

Not pointless, knowing the state of the game and what I should expect before playing and, especially, paying.
I doubt it's like HZD, which had crashes, bugs, and stutters galore that took six months to fix. I doubt motion blur would take that long.
> Is this on 1440p or 1080p mode?

Using the same settings as PS5 in Performance+ mode, so 1080p.
> Testing at higher resolutions should reduce the CPU overhead somewhat. In my testing, I wasn't able to use native 4K because my LG CX can't correctly show the FPS below 60 fps, but at 1440p, where the PS5 tops out at 90 fps, I saw the 3070 stay around 80-85 fps, like at the spot at the very start of Chapter 6 (the auction level). Wish NX Gamer had done more native 4K framerate tests, but you can probably use his Madagascar level benchmarks, which show the PS5 around 45-50 fps max, since it isn't CPU-bound in that scene like it is in his chase-sequence 4K footage.

I was purely benching the CPU performance there. I'll do native 4K framerate tests for you; can you link me to NXG's Madagascar level benchmark? Is it on his channel or IGN?
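The CPU-bound vs GPU-bound reasoning in this exchange can be sketched as a toy frame-time model: a frame takes as long as the slower of the two processors, and raising the resolution mostly inflates the GPU's share. All the millisecond numbers below are made up for illustration, not measurements from the game:

```python
# Toy bottleneck model: frame time = max(CPU time, GPU time).
# GPU cost is assumed to scale with pixel count; CPU cost is not.

def fps(cpu_ms: float, gpu_ms_1080p: float, res_scale: float) -> float:
    """res_scale = pixel count relative to 1080p (1440p ~ 1.78, 4K ~ 4.0)."""
    gpu_ms = gpu_ms_1080p * res_scale     # crude pixel-count scaling
    return 1000.0 / max(cpu_ms, gpu_ms)   # the slower side gates the frame

print(round(fps(9.0, 4.0, 1.0)))  # 1080p: CPU-bound, ~111 fps
print(round(fps(9.0, 4.0, 4.0)))  # 4K: GPU-bound, ~62 fps
```

This is why a 1080p test isolates the CPU and a native 4K test isolates the GPU: in the toy model, the hypothetical 9 ms CPU cost caps the 1080p run, while at 4K the GPU cost quadruples and takes over.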
> Using the same settings as PS5 in Performance+ mode, so 1080p. …

It's in the video in the op, around halfway through.
Nick dropped a video.
2:20 - every cutscene transition is prerendered at 30fps.... OUCH WHAT ?!
It's real-time on PS5, I think, so this means PC I/O is really lagging behind if they can't switch that quickly.
Digital Foundry said:
> Also the brief interstitial videos that sometimes bridge real-time cutscenes and gameplay play back at a straight 30fps.

Timestamped:
> It's in the video in the op, around halfway through.

Alright, I'm on it, boss.
It's the same on PS5. Those brief pre-rendered cutscenes run at 30fps in 40fps mode and likely in other HFR modes as well.
Source:
Timestamped:
> It's in the video in the op, around halfway through.

There you go: PS5 vs PC 4K benchmark with settings as close as possible to PS5's Fidelity Mode (i.e. High). Keep in mind the LOD and motion blur bugs still remain in the PC version.
I tried to mimic/match NXG's gameplay as best as I can, and the 4K version is still being processed by YT.
That first PS5 version of COD had alpha effects on consoles reduced to 1/4 resolution compared to the PC version, which is much easier on the ROPs and overall bandwidth. Furthermore, the latest GeForce driver update also brought strong performance improvements in some games.
In COD Warzone, for instance, fps improved by as much as 44% in some scenes.
The 2080 Ti also outperforms the 3070 Ti and 3070 in some instances in Spider-Man. Some guys on Beyond3D also posted screenshots where the 2080 Ti outperformed the PS5 by 40%+ in some scenes.
There was something strange going on with the VRAM and I'm guessing the BVH structure being extremely heavy on the CPU as stated by Nixxes has something to do with the performance inconsistencies.
PCs and PS5 have different configurations so different bottlenecks will occur. PS5 seems to be doing excellently during rapid streaming but is ostensibly still bandwidth constrained with even in-house devs still opting for lower AF when 16x AF has been free on PC for years. Those 1 to 1 comparisons are often flawed because as I said, different scenes will hit different areas differently (whew, that's a lot of different). It's far more complicated than just PS5 GPU>2070>2080S etc. The PS5 is the sum of all its parts, not just the GPU that can be isolated.
EDIT: In short, the PS5 is performing like an RTX 3070 at 4K, and it even outperforms the 3070 a touch in some scenarios!
It's also worth mentioning that my 3070 is not running at stock speeds. Along with the core clock, the memory is also overclocked, giving the GPU 512 GB/s of memory bandwidth. And the PS5, with 448 GB/s (shared between GPU and CPU), is able to match and exceed the 3070's performance.
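Those bandwidth numbers line up with the standard GDDR6 formula: bandwidth = per-pin data rate × bus width / 8. A quick sketch; 14 Gbps on a 256-bit bus is the commonly published stock configuration for both the 3070 and the PS5, and the 16 Gbps line is the overclock implied by the 512 GB/s figure in the post above:

```python
# GDDR6 bandwidth = data rate (Gbps per pin) * bus width (bits) / 8 bits-per-byte.

def gddr6_bandwidth_gbs(data_rate_gbps: float, bus_width_bits: int) -> float:
    return data_rate_gbps * bus_width_bits / 8

print(gddr6_bandwidth_gbs(14.0, 256))  # 448.0 GB/s: stock RTX 3070 / PS5
print(gddr6_bandwidth_gbs(16.0, 256))  # 512.0 GB/s: the overclocked 3070 above
```

So the overclock buys the 3070 roughly a 14% bandwidth edge over the PS5, with the caveat that the PS5's pool is shared with its CPU.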
> As usual with most PS5 vs PC comparisons, they are using any means they can to twist the comparisons in order to make the PS5 look bad by:
> - Using a cherry-picked short scene or frame instead of a bigger scene average.
> - Using a high-end 5 GHz CPU + 2060 Super in CPU-limited scenes and claiming they are comparing against the PS5 GPU only (and not the CPU + GPU + API combination, as for instance NXGamer rightfully says).
> - When the PS5 still beats, say, a 3070, simply refusing to compare them by claiming x or y reason (which doesn't prevent them from making comparisons in plenty of other analyses).
> - And finally, when a comparison is possible, refusing to compare PC against PS5 uncapped when the PS5 has an uncapped VRR mode (like in Spider-Man).
> This has been the case in almost all of Alex's PC vs PS5 comparisons to date, notably Death Stranding, God of War, and all the Uncharted remasters.

I want them to retest Death Stranding, but with the GPUs paired with a 3700X instead of the 12900K they initially had.
> The Beyond3D forum users were using better CPUs and sometimes even doing things like lowering texture quality, hair quality, or scene density.

No, they used equivalent settings, and you're there too. Please don't lie. I'm talking specifically about the opening cutscene on the 2080 Ti.
Interesting. A 2080 Super comparison has the PS5 ahead in similar scenarios. This guy is using an i7-9700K, which boosts to 4.9 GHz but has only 8 cores and 8 threads, and my PS5 is roughly 15-20 fps ahead. I figured the CPU was bottlenecking the 2080 Super, but then I saw another bottleneck on a 3070 Ti, and the PS5 delivers roughly the same performance at 1440p High settings despite the 3070 Ti being paired with a 5900X.
Pretty impressive showing for the PS5 here. Either that, or this game is really poorly optimized for PC lol.
NX Gamer's results, at least in this game, might not be that far off if those 2080 Super and 3070 Ti benchmarks are any indication. I found a 3080 1440p benchmark but need to run through that level on my PS5 to see how it fares. I am unable to find any 2070 Super benchmarks on YouTube that aren't using DLSS.
> Have they even stated they are working on patching this game? What a shit show.
> Hopefully the PC version of The Last of Us is handled by a more competent developer. Let's see how Sackboy turns out this week.

Sackboy can't be benchmarked until they give the PS5 version VRR support; it's still capped to 60.
> I saw the thread; they said some of the settings don't matter.

I don't know who the "they" you're referring to is. The 2080 Ti used equivalent settings and was massively ahead of the PS5.
> Death Stranding on PS5 performs on par with a 3060, even a 3070 in some scenes. Shouldn't people complain that it was a bad PC version because the game runs so well comparatively on PS5? So the outcome is about the same as with Uncharted, actually.

It's in between a 3070 and a 3070 Ti in that game when using a 3700X as the CPU.
> Why is it blurry af in fidelity mode?

YT compression.
> Interestingly, my 2080 Ti outperforms your 3070 by 10-15% in those scenes.

Likely due to it having more VRAM.
> Interestingly, my 2080 Ti outperforms your 3070 by 10-15% in those scenes.

What CPU do you have?
> YT compression.

YT compresses PC footage less?
> Likely due to it having more VRAM.

I was thinking perhaps that, because the game states it uses 7.5GB at those settings, but your 3070 doesn't appear to hit any sort of VRAM limit, so it's curious. The differential in favor of my 2080 Ti ranges from 7% at the lower bound to 16% at the upper.
> What CPU do you have?

9900K, so a notch better for gaming. Could be the reason.
> Using the same settings as PS5 in Performance+ mode, so 1080p. …

I can link both; the video on his channel tests more of the 4K mode.
> Hey, I just subbed. I love the benchmarks and how quick you are to find things. Will you test The Last of Us, Miles, and other games when they come out?

Thank you! I definitely am planning on taking a look at all those games you mentioned and more. Can't wait to test out The Last of Us in particular when it comes out.
> Amazing. Thanks man!

You're welcome! What do you think? A 3700X + 3070 PC is falling short of the PS5 in both 120fps mode and 4K mode. Does this indicate poor CPU + GPU optimization by Iron Galaxy?
> Nice comparison. So using those very GPU-limited settings, the PS5 (GPU) is basically performing like a 3070 in the cutscene, and even a bit better during gameplay, with consistently lower low FPS on PC (so it's performing like a... 3070 Ti during gameplay?).
> No wonder DF didn't want to make a direct comparison, as it would not have been good for the narrative they have been pushing since the beginning! And obviously their high-end 5 GHz CPU would not have helped their lowly 2060 Super here.

Could theoretically be 1-2% over the 3070 Ti at points, even during gameplay. Wouldn't shock me; it seems you need a 3080 or above to consistently equal or beat the PS5 here.
> YT compresses PC footage less?

I downloaded NXGamer's PS5 footage and ran it through Premiere Pro, so it's been compressed once again. The PC footage is my own, so it's crisp.
> I downloaded NXGamer's PS5 footage and ran it through Premiere Pro, so it's been compressed once again. …

I'm a little surprised no one's first thought is that it's just capture differences.
> You're welcome! What do you think? A 3700X + 3070 PC is falling short of the PS5 in both 120fps mode and 4K mode. Does this indicate poor CPU + GPU optimization by Iron Galaxy?

It's falling short in the 1440p comparisons I did as well. As for why, I think it's simply down to how these games were built on the PS4: built around the PS4's strengths and weaknesses. They might not be properly threading the CPU or even the GPU tasks. The 3070 has an insane number of shader processors compared to the PS5, and even though GPU utilization is in the 90s, it's possible the GPU isn't being fed fast enough. Or the game just prefers AMD cards. But then again, NX Gamer's 16-tflops 6800 is outperforming the PS5 by only around 15-20%, so the PS5 is definitely punching above its weight even against AMD GPUs.
> 9900K, so a notch better for gaming. Could be the reason.

Yeah, that's a badass CPU. I bet the 5.0 GHz clock speed is helping with the performance, since NX Gamer (or was it DF?) said the game likes faster clocks. I saw the same thing in the Matrix demo: my i7-11700KF, which was kinda shit in terms of thermals and wattage, all of a sudden turned into a beast compared to the equivalent AMD CPUs, which topped out at 4.4 GHz. The unlocked power and higher clocks really helped me hit higher FPS in those games. A lot of these console games are single-threaded, and they benefit more from higher clock speeds than from more threads and cores.
> No wonder DF didn't want to make a direct comparison, as it would not have been good for the narrative they have been pushing since the beginning! …

Alex is so sad lmao.
> It's falling short in the 1440p comparisons I did as well. …

PS4 secret sauce.
This is not the first time we've seen this. We saw it with GOW, with Spider-Man, Death Stranding, and to a lesser extent Horizon. I remember pulling up NX Gamer's PS4 Pro 60fps-mode footage, and it was averaging 56 fps with a shit-tier Jaguar CPU and a 4.2-tflops GPU: BETTER than the 1060 and AMD's 6-tflops 580, which are both roughly equivalent to each other. This is what happens when you finally see console games ported to PC; we see just how much third-party games are held back by not being developed on consoles first.
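The teraflop figures thrown around here all come from one formula: peak FP32 = 2 FMA ops per clock × shader count × clock speed. A sketch using the commonly published specs; as this very thread shows, paper TFLOPS ignore the CPU, API, and bandwidth bottlenecks that decide real frame rates:

```python
# Peak FP32 TFLOPS = 2 (ops per FMA) * shader count * clock (GHz) / 1000.

def peak_tflops(shaders: int, clock_ghz: float) -> float:
    return 2 * shaders * clock_ghz / 1000

for name, shaders, clock in [
    ("PS4", 1152, 0.80),        # ~1.84 TF
    ("GTX 1060", 1280, 1.708),  # ~4.4 TF (boost clock)
    ("PS5", 2304, 2.23),        # ~10.3 TF (max clock)
    ("RTX 3070", 5888, 1.725),  # ~20.3 TF (boost clock)
]:
    print(f"{name}: {peak_tflops(shaders, clock):.2f} TFLOPS")
```

Which is exactly why the benchmarks above are interesting: on paper the 3070 has roughly twice the PS5's throughput, yet the measured gap is nowhere near that.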
> PS4 secret sauce.

The performance on the PS5 is fine, but the performance on PC is inexcusable. This is what annoys me about the PC environment; many developers just bank on powerful hardware brute-forcing through shitty performance. As they say, necessity is the mother of invention, and console development is a prime example of that.
But really, Sony put their best team on the field in their home stadium. As a PC gamer I wouldn't be upset; tent-pole exclusives are few and far between these days.
> For Uncharted 4, Iron Galaxy just looked at the most common GPU, which is the 1060, and aimed for that. Except that the 1060 is 2x the power of the PS4, so it shouldn't run similarly. That Naughty Dog did a better job at optimizing I can agree with, but when the PS5 is matching a GPU that is theoretically 50% faster, the dev did a shit job at porting.

I don't disagree. I made similar points back when GOW launched. However, one must remember that these are simply ports; they were never going to retool their engine to support multithreading. The GOW porting studio flatly admitted it.
> Except that the 1060 is 2x the power of the PS4, so it shouldn't run similarly.

A 1060 is getting only 30fps at 1080p? Holy Gabe!
> A 1060 is getting only 30fps at 1080p? Holy Gabe!

No, but it fails to maintain 60 even at 1080p/low settings. The PS4 is pretty much locked to 30fps, so there's probably a bit of headroom to go higher, but not high enough to reasonably hit the 60fps mark. The 1060 is more than twice as powerful as the OG PS4 and even faster than the PS4 Pro, so with a halfway competent CPU it should laugh at 60fps/1080p/low settings.
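A quick sanity check on the "more than twice as powerful" claim, using the commonly cited paper TFLOPS (GTX 1060 about 4.4, base PS4 about 1.84, PS4 Pro about 4.2). Treat the ratios as rough, since TFLOPS ignore architecture and bandwidth:

```python
# Paper-TFLOPS ratios behind the claim above (public figures, rough only).
gtx_1060, ps4, ps4_pro = 4.4, 1.84, 4.2

print(round(gtx_1060 / ps4, 2))      # ~2.39x the base PS4
print(round(gtx_1060 / ps4_pro, 2))  # ~1.05x the PS4 Pro
```

So on paper the claim holds: roughly 2.4x the base PS4, and a hair above the Pro.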
> There is also other stuff under the hood that we simply don't know about. Everyone tried to pass off the PS4 as an underpowered PC, but that's clearly not the case. It might not be full of secret sauce like the Cell, but these latest benchmarks prove that the PS4, and now the PS5, have something that lets the consoles punch above their weight. After all, the PS5 got the same treatment, no? I doubt they went in and re-engineered everything to take advantage of the PS5 I/O and then removed all those changes in the PC version. Pretty sure these ports are based on the PS5 version of the game.

Definitely, but there are levels to this. The Horizon port manages to maintain 60fps+ easily at 1080p/low. It drops to the low 50s at medium, which is PS4 level. Uncharted LL tanks to the 40s at low/1080p on a 1060. Furthermore, the settings in Horizon actually make a difference; anything above Medium in Uncharted on PC is barely noticeable. The scaling is poor and some effects are still broken.
> And it's not the PS4 version.

Are there differences other than resolution and framerate between the PS4 and PS5 versions?
> No, but it fails to maintain 60 even at 1080p/low settings.

Damn! And they dare to charge almost full price!
> No, but it fails to maintain 60 even at 1080p/low settings. …

OK, that's pretty bad. I think GOW and HZD are roughly on par, since they both have PS5 setting presets and average in the mid 50s at 1080p on a 1060. So U4 being worse is a bit concerning.
Here's hoping it gets better over time. HZD was also in rough shape initially.