I still wouldn't buy a 6-core if future-proofing is my goal.
Rule of thumb is just buy whatever the consoles have. Buying a 16 core CPU and expecting more gaming performance is dumb, yeah.
There's a reason that chart is at 1080p. If you game at 1440p or above, the CPU isn't the limiting factor.
Zen 3 is the last generation for the AM4 socket, though. If you have an X570 mobo and want to keep it for the whole generation, I would not get a 6-core. A 5800X should be better than the consoles for the whole generation, but a 12-core will give you headroom.
On the other hand, if you want to get a whole new system in 2 or 3 years, the 5600X is the best choice. If you are always upgrading to the latest and greatest, it's a waste to go above a 5600X for games.
8/12/16 core with slow cores will be slower than 6 core with fast cores from the future (Z4, Z5, AL, RL etc.) for gaming. Buying "many, many" core cpus for the future with gaming in mind was always dumb.
It's pretty preliminary testing, because there are hardly any "next-gen only" games yet, so I wouldn't really upgrade a PC based on this test.
That isn't relevant.
When you buy a CPU and stick with it, an 8-core will be more future-proof than a 6-core because you have 33% more cores, i.e. more performance if it gets demanded.
Comparing a lower-gen CPU versus a newer-gen CPU is pointless. You compare within the same gen.
This is why I bought a 9900K over a 9600K: it's more future-proof. And I already notice it now, where all cores and threads get used more and more.
Well, not necessarily, because the overhead, especially on the CPU, is lower in console space, so it's better to have some reserve :)
An 8/16 CPU won't get you any better performance than a 6/12 if the application can't use more cores efficiently. There are games that spread themselves across many threads but still show no difference between multicore CPUs above some core count.
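The scaling ceiling described here is essentially Amdahl's law; a minimal sketch with a made-up parallel fraction (the 60% figure is an illustrative assumption, not a measured number for any engine):

```python
# Amdahl's law: speedup from extra cores is capped by the serial
# (non-parallelizable) fraction of the work per frame.
def amdahl_speedup(parallel_fraction, cores):
    """Upper bound on speedup versus a single core."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

p = 0.6  # assumed: 60% of per-frame work scales with cores
for cores in (6, 8, 12, 16):
    print(f"{cores} cores -> {amdahl_speedup(p, cores):.2f}x")
# Going from 6 to 16 cores buys ~14% at p=0.6, which is why a faster
# 6-core can beat a slower 16-core in games that don't scale.
```

With these numbers 6 cores already capture most of the achievable speedup; only engines with a much higher parallel fraction reward big core counts.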
You have a same-gen comparison in the OP, 6/12 vs 16/32 (and everything in between). No difference.
Yep, but a major CPU push will only happen if devs go back to 30FPS targets and abandon performance modes in their games. I wonder if and when that will happen.
Last gen consoles also had 8 core CPUs, but they were really weak even for the time. A 2-core/4-thread i3 ran circles around what's in those consoles.
The new consoles have 8 core CPUs too but they happen to be more powerful and support SMT, as you would expect.
Furthermore up to 1.5 cores were reserved for OS functions, people seem to forget that fact.
Although these consoles have a Zen 2 CPU, it's not as powerful as Zen 2 on PC.
One thing to consider is that they are clocked almost 1GHz lower.
Also, consoles have one core (two threads) dedicated to the OS and apps, so for games it ends up being more like 7c/14t.
Another thing to consider is that they use a GDDR6 memory controller, which prioritizes bandwidth over latency.
Tests on the PS5 SoC show a memory latency of over 140ns.
To make things even worse, they only have 4+4MB of L3 cache.
This would put the performance of these CPUs more in line with Zen or Zen+ than with a desktop Zen 2.
The PC counterparts run at higher clocks, which is more useful for games. I bet an R5 3600XT will have no trouble running this gen's games. But of course, lazy ports are still a thing.
> Lazy ports, Denuvo, the overhead of nVidia's drivers, MS bloatware on Windows, etc...

We still have more cache, dedicated RAM with lower latency, and the difference in clocks is considerable enough to make most of the difference on its own. But you are right, those factors can hurt the PC version, although I believe bugs like that can cripple even high-end CPUs.
> CPU will always limit you in some scenes no matter the resolution. For example, you can get scenes in games like Jedi FO; on the green planet a 3600 can't go above 50FPS (I had it), so no matter the res (720/1080/1440/4K if your GPU can handle it) you won't go above 50FPS in this section.

Was the scene using all the cores and threads at 100%? If not, then those frame-rate dips can also occur due to underutilization of the CPU cores/threads, giving you the impression that the game has fully tapped the CPU when, in fact, it has not.
> 5800X will keep you through the whole gen for sure, but 3700X won't.

What do you mean by "won't"? Why wouldn't a 3700X keep you through the gen when games are going to be designed for Zen 2 due to the consoles?
There will be PC ports of this gen's consoles when crossgen is over, and this gen's consoles are using 8 cores.
Where did this “rule of thumb” come from? 8-core consoles came out in 2013 and got their ass whupped by any halfway-decent quad core since day 1.
You're still making the fundamental flaw this video was trying to address. Watch again.
The video is wrong. 6 cores isn't future-proof.
Nothing is future-proof, as tech always advances, as was stated in the video.
Wake me up when there's a single gaming benchmark where the 3700X beats the 5600X because of more cores.
Right now the 6-core Zen 3 chips are killing it, and that likely won't change anytime soon.
Also, console hardware is FIXED, so if your CPU can beat the PS5 today, you don't need to upgrade in the next 5 years, right? Or am I getting something wrong here?
Yeah, I think that is most likely true. If you have a 6-core CPU that beats the PS5 now, I highly doubt we will see a day when some “8-core optimized” PS5 port performs worse on your PC.
Depends on how the game engine is written. The more threaded it is, the more you gain from additional cores.
Was the scene using all the cores and threads to 100%? If not, then those frame-rate dips can also occur due to underutilization of the CPU cores/threads, giving you the impression that the game has fully tapped the CPU when, in fact, it has not.
In Horizon Zero Dawn, Meridian is the section that puts the heaviest burden on the CPU. Here, no matter what, it's not possible/easy to get a locked 120fps even at 720p with a 3700X; it always hovers around 90-100fps. You may think the 3700X is not up to snuff, and I thought that too until I looked at the utilization of individual cores/threads. Then I realized those dips are simply a case of underutilization of the hardware. The game is coded to utilize a certain number of threads, and that's what it will use.
Individual core/thread usage in Meridian:
- 90-100fps (with GPU utilization sitting below 50%)
- unlocked fps
- 50% of 720p
You can see just how many threads are sitting empty with zero work in them and the overall CPU usage is hardly 50% here. Had they coded the game to use more of the 8C/16T CPU, it would have increased the GPU utilization thus increasing the fps.
Remember, Zen 2 on consoles is 4x faster than the Jaguar CPU according to Microsoft, so a 3700X should be perfectly capable of delivering at least 4x the performance of the base PS4 (30fps -> 120fps), but it's only doing a little over 3x of the PS4's Jaguar CPU (and btw, it's a locked 30fps in Meridian on PS4).
Sure, CPUs like the 11700K, 5800X, and even 5600X might reach 120fps here, and that'd simply be due to their faster IPC and other architectural improvements brute-forcing their way to those framerates through unoptimized code. Looking at the 3700X usage above, you can tell there's potential that is not being fully tapped, and that's what utilization looks like in many games today that were designed with Jaguar CPUs in mind.
EDIT: here's another example using Days Gone, where it's averaging under 120fps with plenty of unutilized cores/threads.
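The half-idle behaviour described above can be modelled as a fixed-size worker pool; a toy sketch (the thread counts are illustrative assumptions, not taken from either game's actual engine):

```python
# Toy model: a game hard-coded to N worker threads can never occupy
# more than N of the CPU's hardware threads, so overall utilization
# is capped no matter how big the CPU is.
def max_overall_usage(game_threads, hardware_threads):
    """Fraction of the CPU a fixed worker pool can ever occupy."""
    return min(game_threads, hardware_threads) / hardware_threads

print(max_overall_usage(8, 16))  # 8C/16T part: capped at 0.5 (~50% usage)
print(max_overall_usage(8, 12))  # 6C/12T part: ~0.67, same absolute work
```

The same absolute amount of work shows up as a higher percentage on the smaller chip, which is why "hardly 50% usage" on a 3700X does not mean the game still has headroom on it.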
> You can see just how many threads are sitting empty with zero work in them and the overall CPU usage is hardly 50% here. Had they coded the game to use more of the 8C/16T CPU, it would have increased the GPU utilization thus increasing the fps.
When the CPU utilization does not tell you the utilization of the CPU
The CPU utilization number obtained from the operating system (OS) is a metric that has been used for many purposes like product sizing, compute capacity planning, job scheduling, and so on. The current implementation of this metric (the number that the UNIX* "top" utility and the Windows* Task Manager report) shows the portion of time slots that the CPU scheduler in the OS could assign to execution of running programs or the OS itself; the rest of the time is idle. For compute-bound workloads, the CPU utilization metric calculated this way predicted the remaining CPU capacity very well for the architectures of the '80s, which had much more uniform and predictable performance than modern systems. Advances in computer architecture have made this algorithm an unreliable metric because of the introduction of multi-core and multi-CPU systems, multi-level caches, non-uniform memory, simultaneous multithreading (SMT), pipelining, out-of-order execution, etc.
A prominent example is the non-linear CPU utilization on processors with Intel® Hyper-Threading Technology (Intel® HT Technology). Intel® HT Technology is a great performance feature that can boost performance by up to 30%. However, HT-unaware end users get easily confused by the reported CPU utilization: consider an application that runs a single thread on each physical core. Then the reported CPU utilization is 50%, even though the application can use up to 70-100% of the execution units. Details are explained in [1].
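The 50% figure from the quoted article falls out of a simple counting model; this is a sketch of the reporting arithmetic only, not Intel's actual accounting:

```python
# The OS counts busy *logical* CPUs. With SMT, two logical CPUs share
# one physical core's execution units, so "50% utilization" can mean
# every physical core already has a thread running on it.
def os_reported_utilization(busy_threads, physical_cores, smt_ways=2):
    logical_cpus = physical_cores * smt_ways
    return min(busy_threads, logical_cpus) / logical_cpus

# One thread pinned to each physical core of an 8C/16T CPU:
print(os_reported_utilization(8, 8))  # 0.5 -> reported as "50%"
# ...yet, per the article, the execution units may be 70-100% busy.
```

So a game showing 50% overall usage on an SMT CPU may already be saturating every physical core, which is the trap the article warns about.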
Why are the Zen 3's slaughtering the 10XXX and 11XXX's so badly here? Typically they're near margin of error territory, not a murder scene.
> Because in a test all about cores, they picked a game that doesn't care much about cores.

How is your memory so high?
Also, probably no optimization for those CPUs.