
Intel's 10th Gen Comet Lake-S CPUs: up to 10 cores, 125W TDP; Cascade Lake-S 18 cores, 165W TDP

SonGoku

Member
How is this unreasonable?
Because you won't see the same performance delta at 1080p and above, you know, real gaming scenarios. I thought the reason to get Intel was gaming, not benchmarks.

A few extra frames over 200fps at 720p is a worthless advantage
 
Last edited:
Intel is still beating out AMD in most workloads. It will be interesting to see if that will be the case with this year's CPUs. Most legacy apps I use favor faster single-core performance. And I have nothing on my system that uses more than 8 cores.
 
If you're going there, the 3600 has very similar gaming performance to the 3700X and costs $120 less.
Or the 9900K will have higher performance than the 3950X at $250 less.

No, the 3950X is going to be at least ~50% faster in productivity/multi-threaded benches than the 9900K, seeing as the 3900X is already 45% faster in some tests. More like 60-70% faster than that last-throw-of-the-dice 14nm Intel chip. :messenger_relieved:

But I suspect you're clinging on to that gaming margin of error performance lead at 1440p using a 2080 Ti.
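
For what it's worth, here's the back-of-envelope math behind that claim as a minimal Python sketch. The 45% figure is the one quoted above, the core counts are the published specs, and the 80% scaling efficiency is purely an assumption to cover power limits and imperfect parallelism, not a measured number.

```python
# Rough estimate of 3950X vs 9900K multi-threaded performance,
# extrapolating from the 3900X result quoted above.
# Assumption (not a measurement): going from 12 to 16 cores scales
# at ~80% efficiency because of power limits and imperfect parallelism.

r9_3900x_vs_9900k = 1.45      # "3900X is already 45% faster in some tests"
core_ratio = 16 / 12          # 3950X has 16 cores vs the 3900X's 12
scaling_efficiency = 0.80     # assumed, not measured

r9_3950x_vs_9900k = r9_3900x_vs_9900k * (1 + (core_ratio - 1) * scaling_efficiency)
print(f"Estimated 3950X vs 9900K in MT work: +{(r9_3950x_vs_9900k - 1) * 100:.0f}%")
# ~ +84% under these assumptions; even with a much more pessimistic
# efficiency figure the estimate stays above the "50% minimum" claimed above.
```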
 
Last edited:

SonGoku

Member
If you're going there, the 3600 has very similar gaming performance to the 3700X and costs $120 less.
Or the 9900K will have higher performance than the 3950X at $250 less.
But you are losing cores to fit this apples-to-oranges comparison...
The 3600 has 2 fewer cores than the 3700X and the 9900K has 4 fewer cores than the 3950X. An apples-to-apples comparison would be CPUs with the same number of threads

Next-gen games will benefit more from extra cores than from a minuscule ST perf delta at low resolutions. Not to mention you are missing out on productivity as well.
 
Last edited:

Most apps that most people use are legacy apps. Photoshop, for example, is one of many legacy apps that doesn't give a fuck about a lot of cores. It just wants faster single-core performance. Even in 2019, it's abysmal how few games actually use more than 4 cores. And it shows in the adoption rate when you look at the Steam hardware survey. Most gamers are still on 2-4 cores: https://store.steampowered.com/hwsurvey/cpus/

This just further incentivizes devs not to take advantage of anything more than that. Only 2% have 8 cores. That's pathetic.
 
But you are losing cores to fit this apples-to-oranges comparison...
The 3600 has 2 fewer cores than the 3700X and the 9900K has 4 fewer cores than the 3950X. An apples-to-apples comparison would be CPUs with the same number of threads

Next-gen games will benefit more from extra cores than from a minuscule ST perf delta at low resolutions. Not to mention you are missing out on productivity as well.

Prob a typo, but the 3950X is 16 cores, so you lose... only 8 cores!!! So it's an even worse comparison from Mr 'don't ask me about computers'.
 

Leonidas

Member
How do you know the I9 9900k will beat the 3950x?
Because anyone can look at the reviews of the 6, 8 and 12 core Ryzen CPUs and easily come to the conclusion that the 3950X doesn't have a realistic shot at beating the 9900K, or even the ~$400 cheaper 9700KF, in gaming.
Ryzen 3600 is the only CPU I would buy out of the entire lineup (it will likely perform close to the $750 3950X in gaming), but that's only if I had a CPU budget of $200.
 
Last edited:

Ivellios

Member
Because anyone can look at the reviews of the 6, 8 and 12 core Ryzen CPUs and easily come to the conclusion that the 3950X doesn't have a realistic shot at beating the 9900K in gaming.
Ryzen 3600 is the only CPU I would buy out of the entire lineup (it will likely perform close to the $750 3950X in gaming), but that's only if I had a CPU budget of $200.

The problem with the Ryzen 3600 is that although it offers amazing performance for its price, I don't know if it will be enough once next-gen consoles launch next year.
 

Leonidas

Member
The problem with the Ryzen 3600 is that although it offers amazing performance for its price, I don't know if it will be enough once next-gen consoles launch next year.

That's my problem with it too. There is a caveat when it comes to gaming with all the new Ryzen CPUs: if you spend $200 you get good performance for the cost today, but will it hold tomorrow? If you're spending more than $300 on a CPU, you simply get better gaming performance on Intel. And no CPU, no matter how much you spend, currently matches the 9900K in gaming.
 
Last edited:
Next-gen games will benefit more from extra cores than from a minuscule ST perf delta at low resolutions. Not to mention you are missing out on productivity as well.

They said that generations ago. 4+ core adoption has been slow. Remember when the PS3's Cell, with all its cores, was supposed to revolutionize the CPU core market on PC?

I just think that for so many years next-gen games were supposed to take better advantage of more cores, and it's still so few games that really do. Which is bad. If all developers made games to take advantage of 12-16 cores, performance could be pushed a lot further, right? It's obviously not an ideal scenario to stay stuck in the past, focusing on single-core performance to run your apps while all your other cores sit mostly unused.

I hope things will change. If I go with an AMD option I can expect a 30-40% performance drop in After Effects compared to Intel. That is a massive performance drop, and it makes it hard for people like me who want to check out AMD, because I depend on the apps I need.

And I know this is not AMD's fault. It's fucking Adobe who keeps making everything more bloated, but the lack of competition has not given them an incentive to really rewrite these apps from the ground up like they should.
 

SonGoku

Member
The problem with the Ryzen 3600 is that although it offers amazing performance for its price, I don't know if it will be enough once next-gen consoles launch next year.
That's the stupidity of it all: $300+ CPUs offer more than enough performance for current-gen games. If your plan is to future-proof your CPU for next gen, extra cores/threads are the way to go.
Imagine recommending an 8-thread 9700K over the 16-thread 3700X, or the 16-thread 9900K over the 24-thread 3900X. Only a shill would give such terrible anti-consumer advice.

If you are price conscious, the 3700X is the sweet spot in terms of performance and future-proofing; if you dedicate a higher budget to future-proofing, the 3900X is the way to go.
 
Last edited:

PhoenixTank

Member
Most apps that most people use are legacy apps. Photoshop, for example, is one of many legacy apps that doesn't give a fuck about a lot of cores. It just wants faster single-core performance. Even in 2019, it's abysmal how few games actually use more than 4 cores. And it shows in the adoption rate when you look at the Steam hardware survey. Most gamers are still on 2-4 cores: https://store.steampowered.com/hwsurvey/cpus/

This just further incentivizes devs not to take advantage of anything more than that. Only 2% have 8 cores. That's pathetic.
You see the chicken/egg problem here, right? Mainstream hardware was limited to 4 cores until Ryzen 1XXX and the 8700K appeared, in early 2017 and late 2017 respectively. Hardware cycles take time.
They said that generations ago. 4+ core adoption has been slow. Remember when the PS3's Cell, with all its cores, was supposed to revolutionize the CPU core market on PC?

I just think that for so many years next-gen games were supposed to take better advantage of more cores, and it's still so few games that really do. Which is bad. If all developers made games to take advantage of 12-16 cores, performance could be pushed a lot further, right? It's obviously not an ideal scenario to stay stuck in the past, focusing on single-core performance to run your apps while all your other cores sit mostly unused.

I hope things will change. If I go with an AMD option I can expect a 30-40% performance drop in After Effects compared to Intel. That is a massive performance drop, and it makes it hard for people like me who want to check out AMD, because I depend on the apps I need.

And I know this is not AMD's fault. It's fucking Adobe who keeps making everything more bloated, but the lack of competition has not given them an incentive to really rewrite these apps from the ground up like they should.
I feel you on the Adobe side, but if it is any consolation, we're finally hitting the point where multicore workloads have to happen or there will be a hard performance wall. Moore's law can't continue much longer with the same approach; shrinks are getting more difficult and more expensive.
Of course we have Amdahl's law to contend with too, so parallelization isn't a magic bullet and IPC improvements are going to stay important as well.
8 cores in both next gen consoles, and this time not underpowered... and not a quirky nightmare like Cell. If it doesn't happen I just can't see a route forward for these industries.
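Since Amdahl's law came up, here's a minimal sketch of what it actually predicts. The parallel fractions below are illustrative assumptions, not measurements of any real game engine.

```python
# Amdahl's law: speedup on n cores = 1 / ((1 - p) + p / n),
# where p is the fraction of the work that can run in parallel.

def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

# Illustrative only: how much 8 or 16 cores help depends entirely
# on how parallel the workload is.
for p in (0.50, 0.80, 0.95):
    print(f"p = {p:.2f}: 8 cores -> {amdahl_speedup(p, 8):.2f}x, "
          f"16 cores -> {amdahl_speedup(p, 16):.2f}x")
# p = 0.50: 8 cores -> 1.78x, 16 cores -> 1.88x
# p = 0.80: 8 cores -> 3.33x, 16 cores -> 4.00x
# p = 0.95: 8 cores -> 5.93x, 16 cores -> 9.14x
```

The takeaway: extra cores only pay off as the parallel fraction climbs, which is exactly why IPC still matters alongside core count.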
 

SonGoku

Member
They said that generations ago. 4+ core adoption has been slow. Remember when the PS3's Cell, with all its cores, was supposed to revolutionize the CPU core market on PC?
I don't base it on anybody's words but on the tech itself.
Cell was of great aid to GPUs but a terrible general-purpose processor; if anything it paved the way forward for GPGPU compute.

Current-gen game engines are optimized around 6 weak threads (the 6 Jaguar cores available to PS4/Xbox devs). Quad cores from Intel more than doubled Jaguar's performance, so there was no need to parallelize engines any further to get good results (60fps+) on PC.

Next-gen consoles will be equipped with 3700X-tier CPUs at ~3.2GHz. In 30fps games that max out the console's CPU, double the CPU performance would be needed to brute-force 60fps on PC. The only way to realistically double the 3700X's performance is with more threads.
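
The arithmetic behind that "double the CPU performance" point, as a minimal sketch; the 30fps and 60fps targets come from the post above, the rest is just frame-time math.

```python
# Frame-time budget: a 30fps game gives the CPU ~33.3 ms per frame,
# a 60fps target gives it ~16.7 ms.

def frame_budget_ms(fps: float) -> float:
    return 1000.0 / fps

console_budget = frame_budget_ms(30)   # ~33.3 ms of CPU work per frame
pc_target      = frame_budget_ms(60)   # ~16.7 ms
required_speedup = console_budget / pc_target

print(f"Console frame budget: {console_budget:.1f} ms")
print(f"60fps frame budget:   {pc_target:.1f} ms")
print(f"Required CPU speedup: {required_speedup:.1f}x")
# If the console game already uses its whole 33.3 ms of CPU time per frame,
# hitting 60fps on PC means doing the same work in 16.7 ms, i.e. ~2x the
# CPU throughput, via faster cores, more cores, or both.
```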
 
Most apps that most people use are legacy apps. Photoshop, for example, is one of many legacy apps that doesn't give a fuck about a lot of cores. It just wants faster single-core performance. Even in 2019, it's abysmal how few games actually use more than 4 cores. And it shows in the adoption rate when you look at the Steam hardware survey. Most gamers are still on 2-4 cores: https://store.steampowered.com/hwsurvey/cpus/

This just further incentivizes devs not to take advantage of anything more than that. Only 2% have 8 cores. That's pathetic.

AMD is gonna change that. Next gen is gonna help AMD settle in; it will be adopted by the majority since the price/performance is unbeatable, and people do notice that. Mainstream hardware is gonna change, it just needs some time.
$200 for 6 cores / 12 threads / 35MB cache, I mean c'mon
 

Ivellios

Member
That's my problem with it too. There is a caveat when it comes to gaming with all the new Ryzen CPUs: if you spend $200 you get good performance for the cost today, but will it hold tomorrow? If you're spending more than $300 on a CPU, you simply get better gaming performance on Intel. And no CPU, no matter how much you spend, currently matches the 9900K in gaming.

And this is where I disagree with you: with $329 I get a Ryzen 3700X, 8c/16t (the same as the PS5). Meanwhile I have to spend $500 on Intel for this same setup, plus an additional cooler. The performance gain on the i9-9900K is negligible and not worth the additional $170 + cooler.


That's the stupidity of it all: $300+ CPUs offer more than enough performance for current-gen games. If your plan is to future-proof your CPU for next gen, extra cores/threads are the way to go.
Imagine recommending an 8-thread 9700K over the 16-thread 3700X, or the 16-thread 9900K over the 24-thread 3900X. Only a shill would give such terrible anti-consumer advice.

If you are price conscious, the 3700X is the sweet spot in terms of performance and future-proofing; if you dedicate a higher budget to future-proofing, the 3900X is the way to go.

This is exactly why I'm planning on spending a little more and getting the 3700X. It might be a waste for now, but tomorrow it might save me the trouble of upgrading the CPU.
 
That's my problem with it too. There is a caveat when it comes to gaming with all the new Ryzen CPUs: if you spend $200 you get good performance for the cost today, but will it hold tomorrow? If you're spending more than $300 on a CPU, you simply get better gaming performance on Intel. And no CPU, no matter how much you spend, currently matches the 9900K in gaming.
In my experience, you have it backwards. If you are spending more than $300 on a CPU you want it to last more than a year, or even a couple of years. I have always spent more while looking forward; while there is no reliable way to future-proof anything in tech, having a level head and thinking past bar charts in current games helps a lot.

When other people were buying Core 2 Duo and/or Quad and overclocking it to the sky, I bought into Nehalem at a higher price and locked in what ended up being more than 5 years on a Core i7-950. After that, while other people were buying 6700K Skylakes and already pushing close to 5.0 GHz, I bought into HEDT and have now spent almost 4 years on a 5820K running at 4.3. In both cases, I took a lower core clock and less per-thread performance in order to ensure I had more cores than was mainstream at the time. And in both cases, having more cores at a lower clock ended up being more future-proof than having fewer cores at a higher clock.

Those people who thought they were being smart buying an i5-4590K instead of an i7-4790K? They got fucked, because it turned out games ended up getting multi-threaded a lot more quickly than anyone expected and 4c/4t without HT was like running the race with only 1 leg. The people on 4790Ks today are starting to go through the same thing, as 4c/8t is starting to reach its limits and most people buying into a gaming ecosystem are thinking of what's next, not what's now.

So let's use our galaxy brains and think about this a moment.


[chart: 1080p gaming benchmarks, 9900K vs 3900X]


Now you look at this and you think to yourself, OH MY GOD AMD GETS KILLED BY INTEL STILL IN GAMES I'D BETTER BUY A 9900K RIGHT NOW

But then you realize that you haven't played a game at 1080p on your fire-breathing LED light show Master Race God Machine in like 8 years now. And then well...

[chart: gaming benchmarks at a higher resolution]


Well...that seems a bit more realistic. Yeah. And then if you realize that you haven't been a resolutionlet for the past 4 years and you're gaming in 4K like I am, this is what you see:

[chart: 4K gaming benchmarks (Hexus)]


Hexus.net are the only people crazy enough to "benchmark" CPUs at 4K resolution, so don't blame me for this. Using a 100% GPU-bound situation to prove a point wasn't my idea, but the point is proven nonetheless.

What I'm saying is that in any real-world situation, more likely than not you are GPU-bound to some extent and the tests in 1080p where the 9900K kills the 3900X are only relevant if you are one of those CSGO players who swears you need to be at 400 fps or you can't win. Because it's literally irrelevant to everyone else who is a PC gamer.

But then you add in doing actual other stuff while gaming, and then the picture looks more muddled.



So let's look at gaming while streaming, an example making the CPU do more than just run the game.

[chart: game + streaming benchmarks, 9900K vs 3900X]


Wow, huh. The 9900K's entire lead against the 3900X just evaporates when you make it stream your gameplay at the same time you are playing the game. That's a big hmmm. (Of course, if you are truly a galaxy brain, you are using the NVENC encoder in Shadowplay or OBS for your streaming, in which case your result is that bar at the top. But I'm just demonstrating what happens when 12c/24t takes on 8c/16t while doing more than just playing the game.)
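
As an aside, the "just use NVENC" point amounts to moving the encode off the CPU entirely. Here's a minimal sketch of that idea, driving ffmpeg's h264_nvenc encoder from Python instead of the Shadowplay/OBS route mentioned above; the capture input, bitrate, and RTMP URL are placeholders, and it assumes an ffmpeg build with NVENC support.

```python
import subprocess

# Hand the video encode to the GPU (NVENC) so the CPU only has to run the game.
# All of these values are placeholders; this assumes an ffmpeg build with
# NVENC support and Windows desktop capture via gdigrab.
cmd = [
    "ffmpeg",
    "-f", "gdigrab", "-framerate", "60", "-i", "desktop",        # capture the screen
    "-c:v", "h264_nvenc", "-b:v", "6M",                          # GPU-side H.264 encode
    "-pix_fmt", "yuv420p",
    "-f", "flv", "rtmp://live.example.invalid/app/STREAM_KEY",   # placeholder ingest URL
]
subprocess.run(cmd, check=True)
```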

Also, for people who think they are smart by buying a 9700K instead of a 9900K or 3900X, well... you're making the same mistake as those people who thought they were smart buying an i5-4590K instead of an i7-4790K back in the day.

[chart: game + streaming benchmarks including the 8c/8t 9700K]


Here we can see the 8c/8t 9700K turning into a stuttery mess when trying to play a game and stream it at the same time. The 8c/16t 9900K and the 12c/24t 3900X are casual as fuck doing this. Don't be a brainlet and buy the 9700K instead of the 3700X, 3900X, or 9900K and think you're future-proof because you're not.

So in summary, what have we learned?

(1) More cores + more threads at a lower clock speed is more future-proof than fewer cores + fewer threads at higher clock speed
(2) Real-world gaming consists of way more than looking at bar charts
(3) Smart people always think about the future 1-2 years down the road, not right this second, when investing in a new gaming machine

If we take these lessons into consideration, it's fairly clear who is more future-proof here despite losing at bar charts in 1080p.
 

Leonidas

Member
In my experience, you have it backwards. If you are spending more than $300 on a CPU you want it to last more than a year or even a couple years.

Sure, you want it to last, but you can't say for certain which CPU will be better in a few years' time. And where did I say otherwise?

As for 9700K vs 3700X: the 9700K is better today in gaming. That's all we know; anything else is mindless speculation, which I'm not interested in.

"Futureproofing" worked out poorly for buyers of 1800X. $500 at launch and now have worse gaming performance than a $200 CPU.
This scenario could easily happen again.

Future-proofing Ryzen for gaming has never made sense. You're basically giving up performance today for the chance that maybe, in future games, it might match or exceed Skylake at 5 GHz. Seems very strange.
 
Last edited:
I am not starting these dumb-ass list wars with you; there's plenty of info to find online on why the 9900K is still king in gaming.
You’re comparing a brand new AMD chip to an intel chip from 2018. I suspect Intel will exceed the performance of the new Ryzen lineup even at 14nm. 7nm is all AMD has to give them their boost in performance. Once Intel and nVidia get there, AMD will most likely need to move to 6nm or 5nm to stay ahead. They’re just not as efficient.
 

JohnnyFootball

GerAlt-Right. Ciriously.
You’re comparing a brand new AMD chip to an intel chip from 2018. I suspect Intel will exceed the performance of the new Ryzen lineup even at 14nm. 7nm is all AMD has to give them their boost in performance. Once Intel and nVidia get there, AMD will most likely need to move to 6nm or 5nm to stay ahead. They’re just not as efficient.
You're wrongly assuming that AMD will just stand pat at 7nm and not tweak and refine. With each generation the core will continue to get more and more optimized. See Skylake to Comet Lake for instance. Now, I don't expect AMD to stay on 7nm as long as Intel has at 14nm.
 
You're wrongly assuming that AMD will just stand pat at 7nm and not tweak and refine. With each generation the core will continue to get more and more optimized. See Skylake to Comet Lake for instance. Now, I don't expect AMD to stay on 7nm as long as Intel has at 14nm.
I understand, but Intel and nVidia are much more efficient with power than AMD; AMD is just barely beating both Intel's high end and nVidia's mid range at 7nm. Now, their performance/dollar is great. But they won't hold this crown for long.
 
I understand, but Intel and nVidia are much more efficient with power than AMD; AMD is just barely beating both Intel's high end and nVidia's mid range at 7nm. Now, their performance/dollar is great. But they won't hold this crown for long.

Why is that? AMD is RISING again: more money, more R&D, more talent... :pie_diana:
 

PhoenixTank

Member
You’re comparing a brand new AMD chip to an intel chip from 2018. I suspect Intel will exceed the performance of the new Ryzen lineup even at 14nm. 7nm is all AMD has to give them their boost in performance. Once Intel and nVidia get there, AMD will most likely need to move to 6nm or 5nm to stay ahead. They’re just not as efficient.
That is literally the plan on the CPU side. By the time Intel get their 10nm (basically equivalent to TSMC 7nm) to decent yields, AMD/TSMC will be on 7nm+. When Intel get to their 7nm node AMD/TSMC are hoping to be on 5nm.
I don't see why that matters as some sort of negative here? Different situation in the GPU market where AMD are ahead on process but behind on performance.
Ice Lake is meant to have better IPC than Zen 2 & Coffee Lake. Zen 2 seems to have blipped Coffee Lake in IPC too. Power efficiency of Zen1/+ was pretty damn close, on an inferior process, for AMD too.
 
Last edited:
Can anyone answer this?
The correct answer is that neither Intel nor AMD actually sticks to their rated TDPs when all cores are loaded. So how they rate TDP is almost irrelevant when a 95W-rated 9900K is pushing 160W loaded and a 105W-rated 3900X is pushing 140W loaded. TDP is like the Pirate's Code at this point; they are just suggestions.
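
Putting rough numbers on that, using only the wattages quoted in this post (a tiny sketch, not an independent test):

```python
# Rated TDP vs. measured all-core package power, using the figures above.
chips = {
    "9900K": {"rated_w": 95,  "measured_w": 160},
    "3900X": {"rated_w": 105, "measured_w": 140},
}

for name, c in chips.items():
    over = (c["measured_w"] / c["rated_w"] - 1) * 100
    print(f"{name}: rated {c['rated_w']} W, loaded {c['measured_w']} W "
          f"(~{over:.0f}% over the sticker)")
# 9900K: ~68% over, 3900X: ~33% over -- which is why rated TDP tells you
# little about what an all-core load will actually pull.
```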
 
Last edited:

truth411

Member
Yes, I understand the concept of testing CPUs in CPU-bound scenarios.

If you were less pissy and actually took the time to read, you'd understand that our posts are saying that the i7/i9 performance advantage evaporates as soon as you leave those CPU-bound scenarios and actually use real-world settings.


More reading comprehension, less Intel fanboyism plz

Exactly, I mean c'mon this isn't hard.
 
Last edited:
The correct answer is that neither Intel nor AMD actually sticks to their rated TDPs when all cores are loaded. So how they rate TDP is almost irrelevant when a 95W-rated 9900K is pushing 160W loaded and a 105W-rated 3900X is pushing 140W loaded. TDP is like the Pirate's Code at this point; they are just suggestions.
Holy shit! Imagine that 165W CPU! 300W?? :messenger_fearful:
 

JohnnyFootball

GerAlt-Right. Ciriously.
If money is no object, the 9900K is, at this moment, the fastest CPU for gaming. If developers start utilizing the additional cores of the 3900X, then the 9900K is likely going to get left in the dust.

If money is an object and you want an 8-core CPU, then there is simply no reason to go with a 9900K or 9700K when the 3700X is a much better buy, as the $150 savings (compared to a 9900K) can be put toward a better GPU. As for the 9700K, it's fine for now, but many games are taking advantage of virtual threads, and according to many reviewers the lack of them does affect the 1% lows.

I am hesitant to recommend anything less than an 8-core CPU, as AMD has now put 8 cores at a mainstream price point and it's only a matter of time before games start fully taking advantage of it.

The $150 savings can allow you to go from a 2060S to a 2070S or a 2070S to a 2080S. $150 might also be the savings that make it possible to go from a 2080 to a 2080 Ti.
 
Last edited:
Holy shit! Imagine that 165W CPU! 300W?? :messenger_fearful:
TDP matters a lot more for servers, and corporations take rated TDP a lot more seriously because they stuff these CPUs into rackmounts by the hundreds. AFAIK the rated TDPs for Intel's server-class CPUs are accurate even at full load. Funnily enough, core clocks are much lower on Skylake-X than on any of the consumer lines. I can't imagine why that would be. Inflating core clocks while publishing essentially imaginary, made-up TDPs at the consumer level is a by-product of the core count war that Intel has gotten themselves into with AMD, and right now AMD has a process advantage.

Back when Intel could keep consumers down at 4-6 cores it wasn't an issue, but now consumers are demanding 8 cores, AMD is pushing 12 and 16 cores onto the mainstream desktop, and there's no magic that can make a 95W 4-core 6700K dissipate the same amount of heat as the 95W-rated 8-core 9900K. That's not how physics works.
 
Last edited:

SonGoku

Member
money is no object, the 9900K is, at this moment, the fastest CPU for gaming. If developers start utilizing the additional cores of the 3900X, then the 9900K is likely going to get left in the dust.
Depends on the target resolution; I could only recommend the 9900K to people with 1080p high-refresh-rate monitors.
For 1440p and above the 9900K holds an insignificant advantage; you'd be much better served buying a 3700X and putting the extra $170 plus cooler money toward a better GPU or SSD.

Btw, have any tech sites done OC runs of the 3700X with those beast Noctua coolers?
 
Last edited:

Kenpachii

Member
In my experience, you have it backwards. If you are spending more than $300 on a CPU you want it to last more than a year, or even a couple of years. I have always spent more while looking forward; while there is no reliable way to future-proof anything in tech, having a level head and thinking past bar charts in current games helps a lot.

When other people were buying Core 2 Duo and/or Quad and overclocking it to the sky, I bought into Nehalem at a higher price and locked in what ended up being more than 5 years on a Core i7-950. After that, while other people were buying 6700K Skylakes and already pushing close to 5.0 GHz, I bought into HEDT and have now spent almost 4 years on a 5820K running at 4.3. In both cases, I took a lower core clock and less per-thread performance in order to ensure I had more cores than was mainstream at the time. And in both cases, having more cores at a lower clock ended up being more future-proof than having fewer cores at a higher clock.

Those people who thought they were being smart buying an i5-4590K instead of an i7-4790K? They got fucked, because it turned out games ended up getting multi-threaded a lot more quickly than anyone expected and 4c/4t without HT was like running the race with only 1 leg. The people on 4790Ks today are starting to go through the same thing, as 4c/8t is starting to reach its limits and most people buying into a gaming ecosystem are thinking of what's next, not what's now.

So let's use our galaxy brains and think about this a moment.


[chart: 1080p gaming benchmarks, 9900K vs 3900X]


Now you look at this and you think to yourself, OH MY GOD AMD GETS KILLED BY INTEL STILL IN GAMES I'D BETTER BUY A 9900K RIGHT NOW

But then you realize that you haven't played a game at 1080p on your fire-breathing LED light show Master Race God Machine in like 8 years now. And then well...

[chart: gaming benchmarks at a higher resolution]


Well...that seems a bit more realistic. Yeah. And then if you realize that you haven't been a resolutionlet for the past 4 years and you're gaming in 4K like I am, this is what you see:

[chart: 4K gaming benchmarks (Hexus)]


Hexus.net are the only people crazy enough to "benchmark" CPUs at 4K resolution, so don't blame me for this. Using a 100% GPU-bound situation to prove a point wasn't my idea, but the point is proven nonetheless.

What I'm saying is that in any real-world situation, more likely than not you are GPU-bound to some extent and the tests in 1080p where the 9900K kills the 3900X are only relevant if you are one of those CSGO players who swears you need to be at 400 fps or you can't win. Because it's literally irrelevant to everyone else who is a PC gamer.

But then you add in doing actual other stuff while gaming, and then the picture looks more muddled.



So let's look at gaming while streaming, an example making the CPU do more than just run the game.

[chart: game + streaming benchmarks, 9900K vs 3900X]


Wow, huh. The 9900K's entire lead against the 3900X just evaporates when you make it stream your gameplay at the same time you are playing the game. That's a big hmmm. (Of course, if you are truly a galaxy brain, you are using the NVENC encoder in Shadowplay or OBS for your streaming, in which case your result is that bar at the top. But I'm just demonstrating what happens when 12c/24t takes on 8c/16t while doing more than just playing the game.)

Also, for people who think they are smart by buying a 9700K instead of a 9900K or 3900X, well... you're making the same mistake as those people who thought they were smart buying an i5-4590K instead of an i7-4790K back in the day.

[chart: game + streaming benchmarks including the 8c/8t 9700K]


Here we can see the 8c/8t 9700K turning into a stuttery mess when trying to play a game and stream it at the same time. The 8c/16t 9900K and the 12c/24t 3900X are casual as fuck doing this. Don't be a brainlet and buy the 9700K instead of the 3700X, 3900X, or 9900K and think you're future-proof because you're not.

So in summary, what have we learned?

(1) More cores + more threads at a lower clock speed is more future-proof than fewer cores + fewer threads at higher clock speed
(2) Real-world gaming consists of way more than looking at bar charts
(3) Smart people always think about the future 1-2 years down the road, not right this second, when investing in a new gaming machine

If we take these lessons into consideration, it's fairly clear who is more future-proof here despite losing at bar charts in 1080p.


1) More cores and threads are only useful up to a certain point, and beyond that they actually become something you don't want for gaming. There was no point in buying 8c/16t in 2010, for example.

2) Real-world gaming is just booting up a game and playing it. That's mostly how benchmarks get done, unless they run a scripted environment, but that just makes it easier to compare. So dunno what your point is here.

3) If you thought 1-2 years ahead, you would buy a cheap Ryzen now and upgrade to the real 16/32-core version next year when the 4000 series most likely comes out, or when the consoles launch. Or go for the fastest 8/16-core and have a blast with gaming for the next decade. Or even better, wait until next-gen consoles drop and be ready for the next 7 years without any effort.

4) Your 4K benchmarks make no sense at all. A GPU bottleneck doesn't make the faster CPU any slower; you just don't see the difference right now. Just wait until you run into a super-taxing CPU area and you will see a dip into the lower 50s. Like you mentioned, "look down the road"; you clearly forgot about that here.

Also, why not do a 100K benchmark and compare it with a Pentium 2? Every CPU is now at 1 fps. They're all the same, boys.

5) The 9600 is a shit chip, I agree with that. The 9700 is not; it's just way too expensive. It should have been sold for $159 as an entry chip with a lower clock speed, or $199 with the 5GHz clock speed.

12c/24t will most likely not see much use. Half steps never do. And it's highly unlikely PC games will saturate even 8/16 chips through the next console generation. PC isn't dictated by consoles on this front, not even remotely; the average gear people have is what dictates what games are going to target. Faster cores are what matter most in the end, but yes, you need a certain number of cores; how many gets decided by the market, and that's most likely going to sit around an 8c/16t solution.

The 3900X is a chip that really has no place. If you want better workload performance, why not just opt for Threadripper, which is built just for that? Per-core performance is king in PC gaming and always will be. That's also why you see people upgrade their Ryzens to new Ryzens just for that.

That's why I don't see much future-proofing in the 3000 series chips when the real Ryzen chip is going to launch with the 4000 series, which will drop straight into existing Ryzen boards. Going for a top-end CPU right now seems kinda pointless.


6) I agree with your first part though, that's pretty much what I do.


My advice to people:

Low / mid end = ryzen 3000

low end = 3600
mid end = 3700(x)

high end:

9900k

If you want to upgrade because of an old PC but want something that will really last you 10 years easily:

Buy a Ryzen 3600, a good X570 motherboard + memory, wait on the 4000 series, and upgrade once the 4000 series hits, towards the 16/32-core part (the full-blown Ryzen), or whatever 8/16 part has the highest single-core performance.

Depends on the target resolution; I could only recommend the 9900K to people with 1080p high-refresh-rate monitors.
For 1440p and above the 9900K holds an insignificant advantage; you'd be much better served buying a 3700X and putting the extra $170 plus cooler money toward a better GPU or SSD.

Btw, have any tech sites done OC runs of the 3700X with those beast Noctua coolers?

CPUs don't get any slower at higher resolutions, mate. Just GPUs.

That's why people test at lower resolutions: to see what future GPU setups will do with those CPUs.

With faster new GPUs you will see the same difference between those CPUs as on the 1080p charts.
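
That argument boils down to a simple model: the fps you see is capped by whichever of the CPU or GPU is slower at a given resolution. A minimal sketch with made-up illustrative numbers; none of these fps values are measurements.

```python
# Toy model: the frame rate you see is min(CPU cap, GPU cap).
# Dropping the resolution raises the GPU cap and exposes the CPU cap,
# which is why CPU reviews test at 720p/1080p.

def observed_fps(cpu_cap: float, gpu_cap: float) -> float:
    return min(cpu_cap, gpu_cap)

cpu_caps = {"CPU A": 180.0, "CPU B": 150.0}   # illustrative, not measured
gpu_caps = {"1080p": 200.0, "1440p": 130.0, "4K": 70.0}

for res, gpu_cap in gpu_caps.items():
    line = ", ".join(f"{name}: {observed_fps(cap, gpu_cap):.0f} fps"
                     for name, cap in cpu_caps.items())
    print(f"{res}: {line}")
# At 4K both CPUs show 70 fps (GPU-bound); at 1080p the gap appears.
# Swap in a faster future GPU and the 1080p-style gap shows up at
# higher resolutions too -- the CPU cap never changed.
```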
 
Last edited:

SonGoku

Member
CPUs don't get any slower at higher resolutions, mate. Just GPUs.

That's why people test at lower resolutions: to see what future GPU setups will do with those CPUs.

With faster new GPUs you will see the same difference between those CPUs as on the 1080p charts.
A 9900K will become obsolete long before we ever see a GPU leap so big it makes 1440p/4K run as fast as 1080p does on a 2080 Ti.
Not to mention that as more capable GPUs become available, games' GPU requirements continuously go up; they don't remain static.

You are also conveniently ignoring that the trend for next gen is multi-threaded performance, therefore a 3900X is more future-proof than a 9900K.
 
Last edited:
Recommending the power-hungry and inefficient 9900K when the 3700X and 3900X have just been released is stupid, and must be some kind of pathetic buyer's remorse externalized into some form of coping mechanism.

Both are the same price and have the same gaming performance at 1440p/4K:

3900X

- Draws less power at load despite +4 cores
- Up to 45% faster in real-world productivity performance
- On a more modern platform with PCIe 4
- AM4 has more longevity (Ryzen 4000 compatible)
- Higher rated memory speeds
- Doesn't require you to buy an expensive cooler

9900K

- 5% faster only when you game at 1080p with a 2080 Ti.

5% is nothing even under these unrealistic settings. That's literally its only 'advantage', which is what's desperate about the whole thing.

Now if you compare to the 3700X, the proposition is even worse for Intel.
 
Last edited:

llien

Member
I hope things will change. If I go with a AMD option I can expect a 30-40% performance drop in After Effects compared to Intel.
butwhy.gif?

[chart: After Effects benchmark]



Hexus.net are the only people crazy enough to "benchmark" CPU's at 4K resolution
TPU does that too.

[chart: TechPowerUp relative gaming performance, 2160p]


I understand, but Intel and nVidia are much more efficient with power than AMD
Dude, the 9900K consumes nearly twice the power of the 3700X to beat it by those couple of percent, in certain tasks.
Even the first Zen beat Intel at perf/watt.
 
Last edited:

Spukc

always chasing the next thrill
Maybe Ryzen 4 will beat the Intel 9900K. Not that it matters, as the Intel 10 series will prolly be a lot better again.
How boring
 