
Core i9-10900K benchmark

Armorian

Banned
First they leaked the coronavirus, now China is leaking a Core i9-10900K benchmark:

[benchmark screenshots: core3.jpg, core4.jpg, core5.jpg, core6.jpg, core12.jpg, core14.jpg]


Power consumption:

[power consumption chart: core20.jpg]


The 10-core Intel draws more power than the 16-core Ryzen. The gaming benchmarks are at 1440p, so they're not ideal as a CPU test.
 

Ogami

Neo Member
Yeah, I would take these "leaks" with a barrel full of salt at this point.
We will get more reliable benchmarks soon enough.
 

Armorian

Banned
Yeah, I would take these "leaks" with a barrel full of salt at this point.
We will get more reliable benchmarks soon enough.

It has the same Skylake core as the 6700K from 2015; there is nothing mindblowing to discover here, aside maybe from power consumption. Slightly better in games (should be more visible in 1080p benches), worse in everything else.
 
I've already switched to Ryzen and am not interested in anything Intel has to offer.

Intel price gouged us all for years, offering the absolute minimum 5% garbage improvement year on year... for years... stuck at 4 cores... and then stuck at 14nm.

Intel will have to completely change their ways, for years, before I would even consider switching back.
 

skneogaf

Member
Intel is still the CPU to buy if you just want the most FPS in video games.

If you do absolutely anything else, then I think AMD is the best option.

I've recently read that if you play games and stream your gameplay, then AMD is the best there too, so Intel is only for pure highest FPS.

Hopefully the AMD 4000 series will continue the path that AMD has been on, which should keep up with these new Intel 10 series.
 

nkarafo

Member
My PC is about 40% modern games and 60% emulation, so I'm still conflicted on what CPU my next build will have. Though, 14nm is bullshit.
 

Rentahamster

Rodent Whores
I'm pretty set on upgrading to a 3900X. I mean, I already bought the mobo, but I'm still interested in seeing how this compares for gaming.
They're at the lowest price I've seen on Amazon yet.


Four Hundred Nine US dollars /kazhiraivoice
 
Minor IPC gains, somewhat better performance in heavily threaded applications, somewhat faster at warming up your room on a cold day.

I wasn't expecting much from the 10 series, as we are of course on another 14nm refresh. If Intel priced it at, say, $350-400, it would be a good buy, but it's Intel, so it has to be about $500-ish MSRP and probably $550 retail after price gouging.
 

Paracelsus

Member
Minor IPC gains, somewhat better performance in heavily threaded applications, somewhat faster at warming up your room on a cold day.

I wasn't expecting much from the 10 series, as we are of course on another 14nm refresh. If Intel priced it at, say, $350-400, it would be a good buy, but it's Intel, so it has to be about $500-ish MSRP and probably $550 retail after price gouging.



570-599€ in the old continent.
 

Dr.D00p

Member
My PC is about 40% modern games and 60% emulation, so I'm still conflicted on what CPU my next build will have. Though, 14nm is bullshit.

Emulation is the reason why I stuck with Intel when building my new rig last year. I would have liked to switch to a Ryzen 3800 or 3900, but the 9900K is still the superior choice for emulation.

I can either run it as a full-fat 9900K @ 5GHz or turn off Hyper-Threading, effectively turning it into a 9700K, which can then run at a stable 5.2GHz; that's great for emulators that only use up to 4 cores, like MAME.

Either way of running it, with or without Hyper Threading, is more than enough for any modern game out there as well.

If what they're saying about Ryzen 3 is true though, I'll most likely make the switch to AMD for my next build.
 

Arun1910

Member
I was looking to upgrade to this but figure I may as well get a Ryzen 3950X or just wait for the 4000 series. I only want the upgrade around September.

I'm pretty new to PC building, but I can see that Intel's price-to-performance ratio is crappy.
 

ZywyPL

Banned
Don't buy 14nm this late in the game.

I don't know, it depends on the purpose. But talking strictly about gaming, IMO anything above 4C/4T is largely irrelevant as long as games are designed to run on PS4/XB1. Even a decade-old i5 2500K easily runs games at 60+ FPS, while modern quad-core CPUs easily allow 100-120 FPS, and there's not much, if any, utilization above that. 6C/6T is as far as you need to go if you really want to saturate a 144Hz display; from there, jumping to 8, 10, 12, or 16 cores gives only single-FPS gains, which is completely not worth it.

So Q4 2020/Q1 2021 will be the real test of what these CPUs are capable of, but I personally don't think a 9900K @ 5GHz will have any problems running the games of the upcoming generation; games will be mostly, if not solely, GPU-bound. If 16-32 cores do become what's needed to enjoy high frame rates on PC, then I can't see Intel even being able to put up a fight. Those 10 cores are already stressing the 14nm process way too much if you ask me; I can't see them adding any more cores without downclocking to 4-4.5GHz, which we all know will most likely never happen. And as far as I know, they still have issues rolling out a 10nm desktop lineup, let alone making it any good.
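The diminishing returns from extra cores described above are essentially Amdahl's law: if only part of the per-frame CPU work can be split across cores, adding cores stops helping quickly. A toy calculation (my own illustrative numbers, not from any benchmark in this thread):

```python
def speedup(parallel_fraction, cores):
    """Amdahl's law: overall speedup when only part of the work scales
    with core count."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# Assume (illustratively) that 60% of per-frame CPU work parallelizes,
# and that a 4-core chip delivers 120 FPS.
base_fps_4c = 120.0
for cores in (4, 6, 8, 10, 16):
    fps = base_fps_4c * speedup(0.6, cores) / speedup(0.6, 4)
    print(f"{cores:2d} cores: ~{fps:.0f} FPS")  # gains flatten past ~6 cores
```

With these made-up assumptions, the jump from 4 to 6 cores buys roughly 10%, while 10 to 16 cores buys only a few FPS, which is the shape of the curve the post is describing.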
 

Dr.D00p

Member
but I personally don't think 9900K @5GHz will have any problems in running the games of the upcoming generation

The 9900K (or Ryzen 3700/3800) will be fine for the duration of the new console lifecycle without any problems whatsoever. It won't even need to be run at 5GHz; stock settings will still make it a faster CPU than what's in either next-gen console, and remember that for gaming they will only have 7 cores / 14 threads available.
 

Soodanim

Member
When I look at PC components, I look at things like power consumption and value for money as well as performance. If there's any truth to these, it just means the current trend is continuing - AMD have lower power consumption, more cores, and more competitive price points while Intel still has the single core/per clock advantage. That'd have to be a big advantage to outweigh everything else at this point.
 

INC

Member
When I look at PC components, I look at things like power consumption and value for money as well as performance. If there's any truth to these, it just means the current trend is continuing - AMD have lower power consumption, more cores, and more competitive price points while Intel still has the single core/per clock advantage. That'd have to be a big advantage to outweigh everything else at this point.

But with both next-gen consoles being AMD, aren't most games gonna be optimised with them in mind?
 

Orta

Banned
I've never really understood CPU benchmarking with games. I always look at the benchmarks in tandem with GPUs, which I always assumed carry out 90% of the pretty things we see on our screens.

For example, in the scores in the OP, Far Cry 5 running at 1440p gives you an average or max frame rate of 135 FPS? Is that test carried out minus a graphics card, relying instead purely on the integrated graphics of the 10900K?

As I said, I've never paid much attention to CPU and graphics benchmarking, so this question might not even make sense. But if a CPU alone can give you that kind of frame rate in a modern game, why the need for a GPU? Is it a case of the CPU giving stable performance minus all the bells and whistles a GPU gives?
 

Dr.D00p

Member
I've never really understood CPU benchmarking with games. I always look at the benchmarks in tandem with GPUs, which I always assumed carry out 90% of the pretty things we see on our screens.

The GPU can only draw as fast as the CPU can send data to it.
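A toy frame-time model (my own illustration, not from the post) makes this concrete: each frame has a CPU cost (game logic, draw-call submission) and a GPU cost (rendering), and with the two pipelined the slower stage sets the frame rate. Dropping the resolution shrinks the GPU cost but not the CPU cost, which is why benchmarkers test CPUs at low resolutions. All the millisecond figures below are made up.

```python
def fps(cpu_ms, gpu_ms):
    """Frame rate when CPU and GPU work on successive frames in parallel:
    the slower of the two stages is the bottleneck."""
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_ms = 5.0  # per-frame CPU cost; roughly independent of resolution

print(fps(cpu_ms, gpu_ms=12.0))  # heavy 4K-ish GPU load: GPU-bound
print(fps(cpu_ms, gpu_ms=3.0))   # light 720p-ish GPU load: CPU-bound
```

In the second case a faster GPU changes nothing; only a faster CPU raises the frame rate, which is exactly what a low-resolution CPU benchmark isolates.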
 

Thaedolus

Member
Just another reason to go AMD, Intel has a hell of a lot of work to do.

I don't know that the "optimized for AMD" thing carries a lot of weight when the performance of the top Intel and AMD chips is likely to be so close as to be inconsequential in most cases, which is why performance per dollar can be singled out as the most important factor for most people.

And if modern AAA titles are pretty much neck and neck, but emulation is clearly better on AMD, then I know I'm finally switching back to AMD on my next PC refresh.

Nvidia is still putting out expensive yet baller hardware/innovative tech on the GPU side, so it's a much different argument. Intel has seemingly squandered their previously dominant position.
 

Soodanim

Member
I don't know that the "optimized for AMD" thing carries a lot of weight when the performance of the top Intel and AMD chips is likely to be so close as to be inconsequential in most cases, which is why performance per dollar can be singled out as the most important factor for most people.

And if modern AAA titles are pretty much neck and neck, but emulation is clearly better on AMD, then I know I'm finally switching back to AMD on my next PC refresh.

Nvidia is still putting out expensive yet baller hardware/innovative tech on the GPU side, so it's a much different argument. Intel has seemingly squandered their previously dominant position.
The optimised thing was a "What if", I don't actually think they will be outside of consoles developing to use every core available.

And I definitely wasn't advocating AMD GPUs, I still never hear anything about their drivers being any good and I've never owned one of theirs.
 

Thaedolus

Member
Intel is still superior for emulation and Nvidia drivers are better than AMD's on the GPU side of things.

There should've been an "if" in my sentence: "but if emulation is clearly better on AMD..." I was going off the CEMU videos up there. And yeah, I'm likely sticking with Nvidia on the GPU side and going for a 3080 Ti whenever they come out; currently on a 1080 Ti.
 

longdi

Banned
Interesting quote for the console fanboys. :messenger_grinning_sweat:

Note, 254 W is quite a lot, and we get 10 cores at 4.9 GHz out of it. By comparison, AMD's 3990X gives 64 cores at 3.2 GHz for 280 W, which goes to show the trade-offs between going wide and going deep. Which one would you rather have?

[power comparison chart: 116012.png]
 


With all ten cores running at 5.3GHz, and under-volted, the flagship Comet Lake chip sure draws a lot of juice

We overclocked our Intel Core i9 10900K CPU to 5.3GHz on all cores and it drew a massive 331W of power at peak. That's the processor alone, by the way, not the total platform power draw of our testing rig. That's almost two Xbox One X consoles running at max just to run the new Intel CPU.

But it's also a rather power-hungry CPU when it's running at peak. Out of the box, with all ten cores and 20 processing threads at 100% performance, and at its turbo boost of 4.9GHz it will draw almost 200W. In itself, that's pretty demanding. But when you're gaming, and it's not running every core to its fullest—bouncing around 5.2GHz and very occasionally 5.3GHz—it actually runs impressively cool at 58C and draws 128W.
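The trade-off in that first quote can be put into rough numbers using "core-GHz per watt" as a crude throughput proxy (my own back-of-the-envelope sketch; it ignores IPC, which differs between these architectures, so treat it as illustration only):

```python
# Figures taken from the quoted article: (cores, all-core GHz, watts).
chips = {
    "Core i9-10900K (all-core)":       (10, 4.9, 254),
    "Threadripper 3990X (all-core)":   (64, 3.2, 280),
}

for name, (cores, ghz, watts) in chips.items():
    core_ghz = cores * ghz  # aggregate clock throughput, ignoring IPC
    print(f"{name}: {core_ghz:.0f} core-GHz, "
          f"{core_ghz / watts:.2f} core-GHz/W")
```

On this crude metric the wide chip delivers roughly four times the aggregate throughput per watt, which is the "going wide vs going deep" point the quote is making: the last few hundred MHz of clock speed are bought with disproportionate power.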
 

b0uncyfr0

Member
No point in buying this - AMD's 4xxx series will close the gap again in a few months, and you won't need to buy Intel's chip unless you want the bestest of the bestest of the bestest.
 

Reallink

Member
No point in buying this - AMD's 4xxx series will close the gap again in a few months, and you won't need to buy Intel's chip unless you want the bestest of the bestest of the bestest.

The 4000s are supposed to have significant IPC gains (nearing 20%) and minor clock gains. Taken together they should finally beat Intel across the board in gaming and everything else with no caveats or qualifiers. The bestest of the bestest will be AMD (possibly significantly so) if the rumors are to be believed.
 

Alexios

Cores, shaders and BIOS oh my!
New Intel CPU estimated to be $500, getting creamed in all but the most singlest-core tasks (in an age where both rival companies invest in more and more cores and threads, mind you) by the 3900X, a $400 CPU.

Pretty sad news for Intel, because despite winning the single-core point, it's only winning by like 5%, while the AMD chip is killing it with like 20% better scores where it scores higher.

The 10900K is solid for Photoshop though, by a wide margin. But it fucking loves power: you'll need 300W for that chip, while the AMD 3950X sits at 240W in productivity tasks.

For gaming, the Intel chip is quite the winner until you hit the GPU limit. From 1440p upwards they are basically identical, but it is the world's fastest gaming CPU.

So, basically, expect Leonidas to make another "Intel still the gaming king" thread when this thing launches, lol. If you only game and you want to eke out 15 more FPS at 1080p (so 240 FPS instead of 225), the Intel is your place to be.

Though the Intel still has a few FPS over the AMD at 1440p, we're talking like 5 FPS, and it'll vary by game; with Ryzen's popularity there won't be a big optimization gap between them any more.

Edit: I'm paraphrasing a Discord conversation with a game developer friend of mine (a programmer who has worked with a variety of engines over the last couple of decades), in case the usual trolls - with no arguments - try to claim anyone who says this is just talking ignorant shit, lol.

As if I have a stake in either company, unlike the likes of Leo & Ken. I've never even bought an AMD CPU, but obviously things will change by next year, as I'm due a system overhaul if Intel keeps being shit like this.
 

Rentahamster

Rodent Whores
The only "I need this" scenario that is apparent to me at the moment is for gamers who game at low resolutions and want the highest FPS possible for the best motion clarity on high refresh rate monitors. Other than that, I'm not sure what is the compelling reason for the average consumer to pay more money and electricity for it.
 