mckmas8808
This is awesome analysis.
I doubt that will present itself as an issue very often. The PS5 is also about 10% faster than a 5700 XT, so it performing 10% better makes sense to me.

> Man, the abuse that would get hurled at me for saying the 750 Ti was a weaker GPU than the PS4's at the time, lol.
Much better when you take the PS5's shared bandwidth into account. The 5700 XT gets all 448GB/s to itself; the PS5 doesn't, yet it has a higher avg. fps than a PC with a 5700 XT.
Tried both 144Hz and 120Hz, enabled in-game Vsync, set the framerate limit to 240, and the cutscenes still lock at 60fps. This is so frustrating.

> Put your display on 120Hz or higher and enable Vsync in the menu. It should uncap the 60fps softlock.
I am sorry, but he didn't pair it with the 3600XT in this second video. I don't know why this is so hard. It takes two seconds to take the GPU out of one PC and put it in another.

> I hope SlimySnake is watching this.
The video just proves how wrong this statement is, for this particular title.
I agree that when evaluating performance between multiple PC GPUs you should stick with a single powerful CPU, but as I said in my initial post, at 4K the CPU doesn't really matter much for this game -- especially when the graphics card you're using is something like a 2070/5600 XT, which at 4K is already dipping into the mid-40s to 30fps. It was very much a GPU-bound stress test at that point.
why so? played kena with 24fps pre-rendered cutscenes. never got frustrated myself. do I have low expectations? i dont know
This is an issue with KojiPro games. I recently ran MGSV, and even after removing the fps cap from the config files, it would cap at 105fps. GPU utilization was around 35% at native 4K at 105fps, so it could easily push another 200fps, but nope. Their engine just couldn't do it.
Oh, I keep forgetting people still don't force Vsync off in the NVIDIA Control Panel after every driver install, for some mysterious reason.
You do not need Vsync with a VRR display [cap your frames with RTSS and voilà]. I keep repeating myself over and over, and yet some random d-asses always keep linking me to articles written by ill-informed individuals.
No, I meant that trying every possible thing to uncap the cutscene frame rate, and it still wouldn't, is what's frustrating. I have no problem playing at 60fps itself.
oh mighty stronk squirrel, i really strongly agree with you. i also saw that vsync recommendation on some websites like Blur Busters. eventually, that knowledge and "tip" made it into mainstream discussions, and now everyone and their mother will suggest that combo of frame limit + vsync enabled.
for me, whenever i enable vsync i can feel some weird extra lag. i even tried setting the games at 120 fps; it almost feels like the combo is not working as it should. i dont even expect nvidia to actually put in some work for it to work. to me it feels like vsync is always on in some weird way, even if i'm way below my max refresh rate.
at long last, since i didn't actually have any issue with just capping the framerate to 120-130, i just disabled vsync and i've been happy ever since. i dont even want to pronounce that thing anymore, i just hate it. it's just lag...
I can guarantee you it will run exactly the same even there, as you can see in the vid the GPU is hitting 99% usage.

> The cutscene he ran where Higgs first shows up is a great comparison point because that's where the PS5 drops the most frames. Let's take a 2070, put it in the same machine as the 3600, and run the tests.
That's the plan. Will see what I can do.

> What you can do is rush through Chapter 2. You can probably make all the deliveries in a couple of hours and head to the place where Higgs first shows up. Or see if you can find a save file. IIRC, you can replay all the bosses, and the cutscenes should play before them.
All good stuff, and that's even without proper PS5 code... DS is intrinsically still a ported-over cross-gen title...
People jump on the "consoles are weak" narrative at the beginning of every gen.
Look at the early and infamous DF comparisons of the PS4 vs a GTX 750 Ti...
The PS4 tended to lose those battles, although everyone knew it was the early code and lack of proper API usage that made the GTX 750 Ti pull ahead of the PS4.
The advantage of 8GB of GDDR5 RAM, and the need for it, was also dismissed in forums.
PC folklore was "2GB of VRAM will be fine".
Look how you play Far Cry 6 or HZD on such cards now...
The PS5 will pull away from even cards like an RTX 3070 in a few years on a regular basis.
Things I tried:
-Put display refresh rate to 120Hz, 144Hz
-Disabled (forced off) both in-game and NVCP Vsync (also tried w/ only in-game Vsync set to on)
-Increased max framerate limit to 240 in-game
-Also used RTSS to limit frames to 120fps (with Vsync off, ofc)
Nothing worked. Cutscenes still render at 60fps. -_-
Well, here is my 2070S at 4K max settings and Vsynced. I guess Super is the operative word, as it crushes the 2070 in the video. It's not overclocked and sits at 1950 to 1980MHz out of the box, so it's not a shit card, but not some absurd silicon-lottery winner either.
I didn't cherry pick other than looking for bits where the PS5 looked to be miles ahead so I've included plenty of shots where the PS5 wins.
It's behind the PS5, but not by that much, and Shadowplay eats 2-4 fps. If a 3070 is 25-30% faster than a 2070S, it should absolutely clean house here unless there is something wrong with the rest of the PC.
The images are 1440p as I just screenshotted my recording but the game was running with these settings:
The UI will be really small on mobile as I didn't bump up the font so I'll list the figures for each set of images.
1) 2070 in video: 42fps | PS5: 57fps | My 2070s: 55fps
2) 2070 in video: 41fps | PS5: 60fps | My 2070s: 54fps
3) 2070 in video: 42fps | PS5: 60fps | My 2070s: 57fps
4) 2070 in video: 30fps | PS5: 40fps | My 2070s: 41fps
5) 2070 in video: 38fps | PS5: 52fps | My 2070s: 54fps
And here is an average from the end of the cutscene:
Avg fps: 2070 in video 41.63 | PS5 56.97 | My 2070s 54
I'd guess the PS5 would be between a 2080 and a 2080 Super, which is no surprise for a game favoring AMD, and still impressive to be fair. Not sure why the 2070 in the video performs so poorly, but I'm not going to speculate.
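Worked out from the averages above, the relative gaps are roughly these (a quick sketch; the 2-4 fps Shadowplay cost mentioned earlier is not factored in):

```python
# Relative gaps implied by the averages quoted above:
# 2070-in-video 41.63 fps, PS5 56.97 fps, my 2070S 54 fps.

def pct_faster(a: float, b: float) -> float:
    """How much faster a is than b, in percent."""
    return (a / b - 1.0) * 100.0

ps5, my_2070s, video_2070 = 56.97, 54.0, 41.63
print(f"PS5 vs 2070 (video):   +{pct_faster(ps5, video_2070):.1f}%")      # ~36.8%
print(f"2070S vs 2070 (video): +{pct_faster(my_2070s, video_2070):.1f}%")  # ~29.7%
print(f"PS5 vs 2070S:          +{pct_faster(ps5, my_2070s):.1f}%")         # ~5.5%
```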
I have a 1060 which is basically better than the PS5. Let me know what you want me to test.
Without all the usual console warriors these analysis threads are actually super interesting.
What are the specs of your machine?
P.S. We are gonna need another RTX 3070 to come and represent.... heck, anyone with a 3060 Ti should jump in; that's basically bar-for-bar a 2080 Super.
> No, you have to keep Vsync on in the game's menu! I know NVCP overwrites this setting, but try it - it won't enable Vsync. And maybe try setting "preferred refresh rate" to "highest available" in NVCP.
Err..... I don't think a GTX 1060 can even get to 30fps at 4K, let alone average 60.
okay... it's my time to shine, then. but i trust Md Ray; i feel like the 2070s user is making a mistake somewhere
are these cutscenes and locations in the beginning sequence of the game?
God damn, it requires this specific combination to get it to uncap the cutscene frame rate:
Force Vsync off in NVCP (I was setting this to "Use 3D app setting" before)
+
Set display refresh from 144Hz to 120Hz in NVCP
+
Set in-game Vsync On
If any one of the above settings differs then it goes back to 60fps.
Thank you!
I have a 3060 Ti but don't have the game. Feel free to start a GoFundMe for it.
Werewolfgrandma has a beastly 3060 Ti.
The 2070 performs even worse than a 5700, even though it should be on par with the 5700 XT. This game is basically like AC Valhalla and other AMD-favoring titles.
in the deathloop thread, wasn't he saying it was a whole-system comparison to validate his conclusions? and now he has gone the other way here; why the change?

> Amazing he had to do a follow-up video just to reiterate it's about the GPU.
Are ppl still doubting the gpu in the PS5, or still calling it RDNA 1...
Interesting if so.
I am not sure what YOU think happened to the GTX 750 Ti...

> I'm not sure what you think happened to the 750 Ti? It's still slightly ahead of an original PS4 even if Nvidia doesn't bring driver optimizations for it anymore.
> Maybe there are some VRAM-limited scenarios, but in raw performance the 750 Ti is still more capable than a PS4, and lots of PC gamers still happily game on it.
> The PS5 will never be ahead of the 3070. If the PS5 performs above its expectations in Death Stranding DC, all it means is that some game-engine optimizations were brought in that haven't made their way to the PC version.
Yeah, I remember getting into an argument with him over that. However, the 2700 is actually a good test to compare the PS5 GPU to these PC GPUs, since the PS5 GPU is probably being held back by its pared-down Zen 2 CPU. Alex never takes that into account when doing his comparisons. So NX Gamer is on to something here.
If the DC comes to PC like the rumor mills are saying, and the PC gets any improvements like Flight Sim did from the console work, do you think he will do another video to cover any PC improvements? yeah, me neither.
i don't think so, console cpus work more efficiently
Always great to get more info. I have compared my 2070 OC to a 2070 Super in another video here.
It was gifted to me just now by an awesome person. I'm off work in 3 hours. Will run stuff then.

> Nuts,
> I'll search with friends to see if anyone hasn't redeemed their code yet.
> The more the merrier, you know.
> Since everyone doesn't trust shills, we might as well do extra testing in-house, yeah.
I remember you from the Horizon Zero Dawn thread. There your PC also absolutely crushed NXG's. Really weird.
hmm. very interesting. I never knew about this. Those benchmarks are nuts.
here are the problems:
- baseline api optimization: non-existent on pc
- nvidia driver overhead
i have very relevant test results from an anandtech user:
Page 4 - Discussion - [HWUB] Nvidia has a driver overhead problem. . . (forums.anandtech.com)
practically, if you pair a ryzen 2700x with an amd gpu instead of an nvidia gpu, you magically get 20-25% more CPU-bound frames due to nvidia driver overhead
since ps5/xbox both run on amd hardware, i'm pretty sure the 2700 is at a huge disadvantage here when paired with an nvidia gpu
then again, it's not the only factor. it is clear that console games are less cpu bound in general thanks to their API and stuff. remember, battlefield 5 and 2042 run at 45-60 fps on a ps4, which has 1.6 ghz jaguar cores, on 64-player maps. on the other end of the spectrum, an fx 8350 struggles to push barely 45-50 fps in bf5 even when clocked at 4.5 ghz and at the lowest settings (i know it has fake cores and the ps4 has real cores, but then again, it's 3 times the frequency difference, and the ps4 can only allocate 6 cores to games, so it kinda evens out)
All good info, and it backs up what I am covering in the video and in previous ones. The thing to remember here, though, is that all of these are running north of 60-70fps and some gains are from 96 into the 120s, so the gap is and would be smaller at the 30 or 60fps targets that almost all console games are capped at. The CPU driver cost would not be so visible or prominent at lower fps, as the more you push the thread/IPC, the more the overhead impacts final performance.
this is also why their 5600xt + 3600 system is much more stable in framerates and has a bit higher avg. compared to their 2700 + 2070 system
you can see 99% gpu util. it can be meaningless. i've seen 99% in certain games with various different cpus, and the avg. framerate can differ even in that condition. so the 2700 probably soft-bottlenecks the 2070 even though it appears there's no bottleneck. on the other hand, the 3600 + 5600xt system is enjoying no driver-overhead bottleneck and pushes even more solid cpu performance, and of course, zen 2 by itself is superior to zen+
in other words, even if we accept the 2700x as a substitute for the ps5 cpu, you still have to use a cpu that is at least 25% faster than a 2700x if you're going to compare an nvidia gpu with the consoles, to get the optimal cpu-bound situation. because even if the 2700x can perform like a ps5 cpu, it will still be at a 25% disadvantage, which can be huge, since with such high-end gpus there's no room for error and performance loss
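The arithmetic behind that headroom claim can be sketched like this (the 20-25% figure is the thread's claim from the AnandTech results; the linear CPU-scaling model is my assumption):

```python
# Sketch of the headroom argument: if an AMD GPU yields ~20-25% more
# CPU-bound frames than an NVIDIA GPU on the same CPU (driver overhead),
# the NVIDIA-paired CPU needs roughly that much extra speed to reach the
# same CPU-bound ceiling. Purely illustrative numbers, linear scaling assumed.

def nvidia_cpu_bound_fps(amd_cpu_bound_fps: float, amd_advantage: float) -> float:
    """CPU-bound fps with an NVIDIA GPU, given AMD's relative advantage
    (amd_advantage = 0.25 means AMD delivers 25% more CPU-bound frames)."""
    return amd_cpu_bound_fps / (1.0 + amd_advantage)

amd_fps = 100.0  # hypothetical: a 2700X that is CPU-bound at 100 fps with an AMD card
for adv in (0.20, 0.25):
    nv_fps = nvidia_cpu_bound_fps(amd_fps, adv)
    print(f"AMD advantage {adv:.0%}: NVIDIA-paired system CPU-bound at ~{nv_fps:.0f} fps, "
          f"needs a ~{adv:.0%} faster CPU to match")
```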
in this case, i don't refute you; it is clear that this scene is heavily GPU bound. even a ryzen 5950x clocked at 5 ghz with 4000 mhz ram in tow cannot produce more than 49-50 fps in that scene. it is simple: the GPU is choking and pushes an avg. framerate equal to the PS5's in that particular scene and situation
As you say, console CPUs are not multi-purpose and do not need such complexity, or (until now) to target higher than 60 unless VR.
What's your average frame rate now?
good one
Keep that save file. And send it to Werewolfgrandma and Low Moral Fibre so they can run the same test. That scene drops below 60fps a lot of times on the PS5. Should be the perfect test for cards like the 3060 Ti, which is more powerful than the 2080 Super.
If remote install works this time, it will be ready when I get home. If not, I'll have to download it then. 1.5Gb internet, so it shouldn't take too long.
He specifically addressed what the follow-up video was for and about.
Actually, I'd highly recommend that you just burn through Chapter 2. You don't have to use his save file, but please get out of that Chapter 2 area ASAP. Everyone gets stuck there doing side missions and then loses interest. The game does not start until Chapter 3.
Edit: I actually do want to play the game. I always play open world with as little HUD as possible, and this game seems built for that. So if it has major spoilers, I'll just play to that point tonight if it's not too far.
I have the game but not a 3060 Ti. Can someone fund me?
That's actually really impressive for a 1080 Ti.

> my contribution
> i7 7700 + 1080ti at 4k max settings: the intro is 60fps 80% of the time
> there are times at 50-53 and 57
> and the specific moment is 43fps....
I would also like to see a 6800 or 6800 XT with its clocks turned down to test the frequency-vs-CU theory.
That said, this would be a great CPU to test AMD cards like the 6600 XT and the 6700 XT, the two RDNA 2.0 cards that are the closest to the PS5 and XSX. The 2700 performs similarly to the PS5 CPU variant that went on the market recently, the 4800U I believe. So you can use this CPU to do some very interesting comparisons that will either confirm or bust any myth of console coding-to-the-metal. If the cards perform identically when downclocked to match the PS5 and XSX TFLOPs, we can say there is no secret sauce in the PS5's low-level API. This will also help point out any advantages the PS5 I/O might have over the PC and the XSX.
I am surprised that no one seems to be interested in doing those comparisons. Alex is happy jerking off to 2060 Super comparisons to prove his point, but I think there are far more interesting findings to be had if they stick with these two RDNA 2.0 cards.
maybe it's a general ampere bottleneck, people, we may not know
it is clear that the rtx 3000 series brought huge numbers like 10k cores and such. but not every core is created equal, never forget that
in the end, sadly, from the gtx 1060 to the rtx 3090, nvidia has a problem: frequency. all them gpus run at 1.9 ghz. at least amd made some strides there. games always like frequency.
maybe we're seeing an internal ampere bottleneck here that somehow causes the 3070 to not shine or something. but that's an nvidia problem; in the end, the ps5 runs at 2.2-2.3 ghz and rdna2 cards are flying high above 2.5 ghz+
maybe in that particular transition scene, frequency becomes the bottleneck. as i've said in my other post, not every 99% gpu is created equal. you can see 99% gpu util in ac: valhalla, but the gpu may be consuming 160w. what does 160w mean for a 220w tdp gpu? it means some parts of the gpu are internally bottlenecked and not working at all, yet the core load may present itself as 99%. so in this particular case, maybe his 2070 super gets higher "computational" usage and the 3070 kinda gets underused. but as i've said, it's a problem with the ampere architecture. i noticed this problem when i did a comparative benchmark against my friend's gtx 1080ti in nba 2k19. practically, the 3070 failed to scale beyond 1080ti performance in certain old games
in this video, this valuable person talks about something being awkward on ampere and claims that nvidia's software team can tackle the issue with software optimizations. then again, stuff like this makes me uneasy; if a gpu's performance depends on the game developer and nvidia, that's not good. it may even imply that ampere will end up like kepler: alone, forgotten and left behind... this is why GCN cards aged better: AMD kept the same architecture over the years.
nvidia, however, constantly shifts architectures, and i'm pretty sure that causes trouble for game development
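One way to sanity-check the "99% but underfed" idea is to log utilization next to power draw. `nvidia-smi --query-gpu=utilization.gpu,power.draw,power.limit --format=csv,noheader` emits lines you can parse like this (a rough sketch; the 0.75 ratio threshold is an arbitrary illustration, not an NVIDIA metric):

```python
# Parse "nvidia-smi --query-gpu=utilization.gpu,power.draw,power.limit
#        --format=csv,noheader" samples and flag intervals where the core
# reports ~full utilization while drawing well under its power limit --
# a hint that parts of the chip are idling internally.
# The 0.75 threshold is an arbitrary illustration, not an NVIDIA metric.

def flag_hollow_util(csv_lines, util_floor=95, power_ratio_ceiling=0.75):
    flagged = []
    for line in csv_lines:
        util_s, draw_s, limit_s = [f.strip() for f in line.split(",")]
        util = int(util_s.rstrip(" %"))
        draw = float(draw_s.rstrip(" W"))
        limit = float(limit_s.rstrip(" W"))
        if util >= util_floor and draw / limit <= power_ratio_ceiling:
            flagged.append((util, draw, limit))
    return flagged

samples = [
    "99 %, 160.25 W, 220.00 W",   # "99%" but only ~73% of the power limit
    "99 %, 215.10 W, 220.00 W",   # genuinely loaded
    "62 %, 120.00 W, 220.00 W",   # plainly CPU/engine-limited
]
print(flag_hollow_util(samples))  # → [(99, 160.25, 220.0)]
```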
You know, something I have brought up time and time again is how I really dislike these 1% low benchmarks Alex does, where he finds the worst possible point in a game and bases all of his benchmarks on it. You can see from your average that the actual framerate is much higher: 46 vs 50 for the 1% lows, compared to 56 vs 65 on average.
I re-ran/benchmarked the beginning cutscene, now with an uncapped frame rate; those dips into 49-55fps are once again happening the same way as before, in the exact same spot (even your card shows the same). But the avg. fps is now around 14% higher than the PS5 owing to the unlocked fps.
This time I might upload the vid to YT.
PS5: 49fps
Avg. FPS: 56.91
Here's the OG benchmark with a cap:
D3D12: 50fps
Avg. FPS: 59fps
Here's the uncapped one. This time I also added the 1% low min fps data:
D3D12: 50fps
Avg. FPS 65fps
1% Low: 46fps
Now on to the Higgs cutscene, this might take some time though.
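As an aside, avg and 1% low figures like the ones above are typically derived from a frametime capture. A minimal sketch of the usual calculation (some tools instead report the fps of the 99th-percentile frametime; the frametime values here are made up):

```python
# Compute average fps and "1% low" from a list of per-frame times (ms),
# the way overlay tools commonly report them: average fps is total frames
# over total time, and 1% low is the average fps of the slowest 1% of
# frames. Frametime values below are made-up illustrative data.

def fps_stats(frametimes_ms):
    total_s = sum(frametimes_ms) / 1000.0
    avg_fps = len(frametimes_ms) / total_s
    slowest = sorted(frametimes_ms, reverse=True)
    n = max(1, len(slowest) // 100)          # slowest 1% of frames
    one_pct_low = n / (sum(slowest[:n]) / 1000.0)
    return avg_fps, one_pct_low

# 400 frames at ~16.7 ms with a handful of ~22 ms stutters
frames = [16.7] * 396 + [22.0] * 4
avg, low = fps_stats(frames)
print(f"avg {avg:.1f} fps, 1% low {low:.1f} fps")  # → avg 59.7 fps, 1% low 45.5 fps
```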