
JayzTwoCents: Intel cant beat AMD anymore (Rocket Lake review)

Given what they cost at Microcenter these days, I'd think the best bet would be either a 10700K or a 10850K ($250 and $320, respectively).
Picked up the 10900KF for $330 from MC with an open-box Strix Z490 for $230. I'm a happy camper. The 5600X would be a bit better single-core, but I figured the 10900KF would age much better with 4 extra cores, and I plan on closing the gap with a 5.2GHz overclock and some 4400MHz RAM. Also figure I can resell the 10900K for around the same price within the next 2 years; it should hold its value well. Either way, it's good to have some real competition now!
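The "closing the gap with an overclock" reasoning can be sketched with a toy model, assuming single-core performance scales roughly as IPC x clock. The IPC delta and boost clocks below are illustrative assumptions for the sake of the arithmetic, not benchmark figures:

```python
# Toy model: single-core performance ~ IPC x clock.
# All numbers below are illustrative assumptions, not benchmarks.
def single_core_score(ipc, clock_ghz):
    """Relative single-core performance under a linear-scaling assumption."""
    return ipc * clock_ghz

comet_lake_ipc = 1.00   # baseline (assumption)
zen3_ipc = 1.15         # assume Zen 3 ~15% higher IPC (assumption)

oc_10900kf = single_core_score(comet_lake_ipc, 5.2)   # the 5.2GHz overclock
stock_5600x = single_core_score(zen3_ipc, 4.6)        # ~4.6GHz boost (assumption)

print(f"OC'd 10900KF reaches {oc_10900kf / stock_5600x:.1%} of the 5600X")
```

Under these made-up numbers the overclock nearly erases the single-core deficit, which is the bet the post is making.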

Does seem pointless buying into RL when you can get CL stuff for so cheap right now. Microcenter prices are insane. Hopefully Alder Lake picks up the pace and gives Intel some solid IPC gains that benefit games.
 
Last time AMD beat Intel, Intel's retaliation left AMD behind for 10 years basically. The cycle continues.

There's no doubt that AMD is the new king of the hill, and I purchased my first AMD CPU a few months ago for a new build, but I'm still optimistic about Intel's long-term projections. I even purchased stock, just to keep myself honest. They're going through a reorg, so let's give them some time to prove they can compete at a higher level again.
 
For gaming, Intel already does. Interesting results:



That was a 5800H though, which isn't even the most powerful chip for AMD. That's why I said the 5900HX. Also, the Intel has 50% more heat pipes than the AMD, so the cooling is improving performance more than anything.
 
Yeah... I would not poke Intel with a stick too much. Considering how long they squeezed 14nm, and that it took AMD three generations of Zen to finally pull ahead, I can't imagine what happens when Intel goes TSMC 5nm. Right now they can't.
Because AMD will of course do nothing and remain static...
 
That was a 5800H though, which isn't even the most powerful chip for AMD. That's why I said the 5900HX. Also, the Intel has 50% more heat pipes than the AMD, so the cooling is improving performance more than anything.
And that's not even an 11875H either.
The CPUs were swapped between chassis, and temps didn't reach the thermal throttling limit in either laptop, so cooling was not really an issue.
 
Intended to get an 11900K. Went shopping for Z590 mobos at Microcenter last month and there sat a 5950X. I thought, "you know, I'm an Intel guy, but this thing is here...."

I'm watching Gamers Nexus' review right now and hoo boy I made the right call.
 
Once I thought that AMD going fabless was a huge mistake. Silly me.

Intel is in the position they deserve to be in: minimal upgrades each iteration, frequent socket changes, 4C/8T for several gens while milking HEDT with higher core counts (but much more expensive offerings).

People forget the tiny amount AMD puts into R&D (compared to Intel). Coming from Bulldozer, there is no denying that Ryzen is an amazing achievement and something Intel didn't expect; they were caught with their pants down, and by this I mean being stuck forever on 14nm.

Glad AMD is in a much better position, or should I say, the best option in almost 99% of desktop scenarios, and alone in HEDT, where Intel doesn't have any chip to compete.

Sure, Intel will recover once they sort out their 10nm for desktop, but when that time comes AMD will probably be on 5nm. Then again, Intel's resources are much greater than AMD's, so that's expected.

It's true they got complacent. But if they went all out against a weakened AMD wouldn't they have been smacked down by antitrust lawyers?

Plus, in addition to Alder Lake, Intel just hired Patrick Gelsinger as its new CEO. I don't know what will be done about future CPUs, aside from what I hear about opening their own foundries to other CPU producers, and also licensing out their own designs (a la ARM).
 
They-blow-you-up-today-you-blow-them-up-tomorrow.-Its-just-business.jpg
 
Got a 10400F for 90 bucks while the 3600 is 165 bucks, and mostly the Intel performs better in games, so I don't care what that guy is thinking.
 
AMD is using 7nm and just barely beating Intel's 14nm. Think about it...

Sure, you can say that if you compare 8c vs 8c and ignore power consumption, but the reality is AMD can offer 16c at similar power to Intel's 8c, and in thread-heavy workloads it totally dominates.

On top of that, when it comes to HEDT there is only one player at the moment, which is AMD, and they haven't released a Zen 3-based Threadripper yet either.

Alder Lake is supposed to have up to 20% better single-thread performance than Rocket Lake. Considering Rocket Lake is already behind Zen 3, and AMD looks to be doing a Zen 3 refresh, Alder Lake may only be 10% faster than Zen 3+ at best. And if the 29% IPC gain for Zen 4 is true (actual performance may beat that, since it might clock higher), Alder Lake will be miles behind in all metrics.

So Intel 14nm does not really compete at all.
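The percentage stacking in that post can be checked with quick arithmetic. Every number here is the post's own speculative figure (plus an assumed ~10% Zen 3 lead over Rocket Lake), not measured data:

```python
# Compounding the post's speculative generation-over-generation numbers.
# Every percentage here is a guess from the post, used purely for arithmetic.
rocket_lake = 1.00
zen3 = rocket_lake * 1.10        # assume Zen 3 ~10% ahead of Rocket Lake
alder_lake = rocket_lake * 1.20  # "up to 20% better single thread" than RKL
zen4 = zen3 * 1.29               # "29% IPC gain" for Zen 4

print(f"Alder Lake vs Zen 3: {alder_lake / zen3 - 1:+.1%}")  # roughly +9%
print(f"Alder Lake vs Zen 4: {alder_lake / zen4 - 1:+.1%}")
```

Note that gains compound multiplicatively, which is why a 20% uplift over Rocket Lake shrinks to single digits against a part that is already ahead.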
 
Well, I hope there will be the Empire Strikes Back of CPUs at the end of this year or early next year. Team Red is the leader at the moment, and they are forcing Intel to sharpen their teeth.
 
The i5 doesn't look that bad compared to the comparable 5600X, but the i9 seems like a joke. It loses out to the previous 10th gen i9 and scores comparably to the 10th gen i7, which seems reasonable since they both have 8 cores. Then again the i9 is much more expensive. If you are dead set on Intel then there is no reason to go with the 11th gen i9 and it's much better to go 10th gen.
Otherwise AMD is your best bet. I'm still surprised that AMD hasn't released their non-X SKUs, but considering the chip shortage, it makes sense that they aren't releasing cheaper SKUs yet.

I'm happy I got my 5900X at MSRP. (Even came with a copy of Far Cry 6, whenever that releases.)
 
I wouldn't bank on Intel turning it around anytime soon. They have completely and spectacularly failed trying to implement the last two generations of Intel's manufacturing process, costing them billions and putting them years behind their competitors. There has been article after article over the past year about how they are considering outsourcing their flagship cpu manufacturing to an already capacity constrained TSMC and at this point I don't see them as having any other option. They already sold their SSD/Memory division to SK Hynix.
I have not read anything that said what they planned to outsource, just that they planned to outsource some chips. It could be the laptop chips. I find it hard to see Intel not driving to make their best, high-margin chips on their best process and outsourcing the lower-priced commodity CPUs to TSMC. Intel is also opening a foundry business so others can outsource through them.
 
So paying extra now allows you to use faster SSDs without replacing the CPU down the line, and means you will be able to continue to upgrade the GPU without worrying about losing any performance to PCIe bottlenecks (gap is only a percent or two now, but may grow).
 
AMD is using 7nm and just barely beating Intel's 14nm. Think about it...

The fact that they best them in multi-threaded applications, and at much lower power consumption. Also, they are doing so without needing to reach 5.0GHz+, which causes heat issues.

5nm CPUs are coming off a new design and will better utilize features like Smart Access Memory and improved chipset controllers such as PCI Express 4.0+. They have the core counts now; it's about making them more efficient and using memory bandwidth in tandem with dedicated or onboard GPUs. AMD is more forward-thinking in design. More cores with less latency and faster memory access leads to better real-world results.

And if you test most new games, Intel's edge is even smaller than it used to be. Sure, Intel has an IPC edge in some games that were either old to begin with or were designed on old engines. Anything coming out now and down the line is being developed around Zen specs. So over time the gap will increase.

Which Jay shows with Adobe.
 
So paying extra now allows you to use faster SSDs without replacing the CPU down the line, and means you will be able to continue to upgrade the GPU without worrying about losing any performance to PCIe bottlenecks (gap is only a percent or two now, but may grow).

Hahaha.
Stop drinking the koolaid son.

By the time we have a GPU that saturates PCIe 3.0, your CPU will likely be the bottleneck anyway.
PCIe 3.0 x16 has a limit of 16GB/s. Unless you are playing at 300fps and/or 4K144 max settings there is no difference, and there won't be one until post-Hopper RTX 5000.

As for needing a PCIe 4.0 SSD, depending on your workload that's not a 400+ dollar benefit [CPU + SSD], be real.

P.S. You'll likely be upgrading to a PCIe 5.0/DDR5 motherboard before you even realize you "needed" PCIe 4.0.

Intel-600-Chipset-Specifications.jpg
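The ~16GB/s figure for PCIe 3.0 x16 follows from the published per-lane signaling rate and encoding overhead. A quick sketch (the link speeds and 128b/130b encoding are the standard PCI-SIG numbers; the Gen4 x4 case is added here just for comparison):

```python
# PCIe per-direction bandwidth = signaling rate x encoding efficiency x lanes.
def pcie_gbs(gts_per_lane, lanes, efficiency):
    """Bandwidth in GB/s per direction (GT/s -> bytes via /8)."""
    return gts_per_lane * efficiency / 8 * lanes

# PCIe 3.0: 8 GT/s per lane with 128b/130b encoding.
gen3_x16 = pcie_gbs(8, 16, 128 / 130)
# PCIe 4.0: 16 GT/s per lane, same encoding; x4 is a typical NVMe SSD link.
gen4_x4 = pcie_gbs(16, 4, 128 / 130)

print(f"PCIe 3.0 x16: {gen3_x16:.2f} GB/s")  # ~15.75, the "16GB/s" above
print(f"PCIe 4.0 x4:  {gen4_x4:.2f} GB/s")   # ~7.88
```

So a Gen4 NVMe drive on its x4 link has roughly half the bandwidth of the Gen3 x16 GPU slot, which is the context for the SSD argument above.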
 
Hahaha.
Stop drinking the koolaid son.

By the time we have a GPU that saturates PCIe 3.0, your CPU will likely be the bottleneck anyway.
PCIe 3.0 x16 has a limit of 16GB/s. Unless you are playing at 300fps and/or 4K144 max settings there is no difference, and there won't be one until post-Hopper RTX 5000.

As for needing a PCIe 4.0 SSD, depending on your workload that's not a 400+ dollar benefit [CPU + SSD], be real.

P.S. You'll likely be upgrading to a PCIe 5.0/DDR5 motherboard before you even realize you "needed" PCIe 4.0.

Upgrading to a PCIE5/DDR5 motherboard means a new CPU and new memory as well. So it's a big saving if you can continue on your old platform. And if a 5800X is sufficient for 60 FPS over the course of the generation (console games seem to be targeting 60 FPS in specific modes), then you could last 5+ years, with targeted GPU upgrades, if necessary. I just replaced my i5-6600K/GTX 970 system, which was over 5 years old. And this console generation could last for 7 years.
 
Upgrading to a PCIE5/DDR5 motherboard means a new CPU and new memory as well. So it's a big saving if you can continue on your old platform. And if a 5800X is sufficient for 60 FPS over the course of the generation (console games seem to be targeting 60 FPS in specific modes), then you could last 5+ years, with targeted GPU upgrades, if necessary. I just replaced my i5-6600K/GTX 970 system, which was over 5 years old. And this console generation could last for 7 years.
Everything you just said is also true for the 10600K, 10700K, 10800K, and 10900K?
 
Everything you just said is also true for the 10600K, 10700K, 10800K, and 10900K?
It might be, if 3.5 GB/s is sufficiently fast, once developers start targeting the 4.8GB/s base speeds of the XSX. And assuming Hopper performs just as well on PCIe 3.0, if you need to upgrade the GPU (e.g. moving up to 4K60 with ray tracing maxed out).
 
It might be, if 3.5 GB/s is sufficiently fast, once developers start targeting the 4.8GB/s base speeds of the XSX. And assuming Hopper performs just as well on PCIe 3.0, if you need to upgrade the GPU (e.g. moving up to 4K60 with ray tracing maxed out).

Final Flash, PCIe 4.0 SSDs won't have the market penetration needed for devs to "target" them anytime soon.
4K60 is easy work at 16GB/s. If Hopper is powerful enough to run max settings 4K144, PCIe 3.0 might be the 10-series downfall... at 4K144.
4K60? Whatever's after Lovelace and Hopper, maybe it will "need" PCIe 4.0 to run at "regular" res/framerates.
 
That's all any of us should care about in this market....

My old flatmate bought a 1440p 144Hz monitor to play Warzone on. He was so excited, and then when he got it he couldn't even get to 100fps at 1080p with the settings down at minimum. I'm not sure if there was another bottleneck somewhere that was stopping him, but I felt bad for him. I'll just stick to trying to get 60fps at 4K, which I rarely can these days on a GTX 1080.
 
I have not read anything that said what they planned to outsource, just that they planned to outsource some chips. It could be the laptop chips. I find it hard to see Intel not driving to make their best, high-margin chips on their best process and outsourcing the lower-priced commodity CPUs to TSMC. Intel is also opening a foundry business so others can outsource through them.
This is an older article, but there are more recent ones.

The truth is Intel delayed three generations of chips through a succession of manufacturing failures. Which is why their stock shed $40+ billion in value, why Apple is shifting its Macs away from Intel to chips made by TSMC, and why NVIDIA and AMD (which both outsource to TSMC) are gaining in areas Intel used to dominate.

Maybe this new CEO will turn it around eventually, but right now they are way behind and are playing catch up.

Intel contemplates outsourcing advanced production, upending Oregon's central role

Now, Intel is laying the groundwork to toss the old model out the window. It is openly flirting with the notion of moving leading-edge production from Oregon to Asia and hiring one of its top rivals to make Intel's most advanced chips.

The company says a decision is likely in January.

It's a momentous choice that follows a string of manufacturing setbacks at the Ronler Acres campus near Hillsboro Stadium, failures that have cost Intel its cherished leadership in semiconductor technology – perhaps forever.
 
My old flatmate bought a 1440p 144Hz monitor to play Warzone on. He was so excited, and then when he got it he couldn't even get to 100fps at 1080p with the settings down at minimum. I'm not sure if there was another bottleneck somewhere that was stopping him, but I felt bad for him. I'll just stick to trying to get 60fps at 4K, which I rarely can these days on a GTX 1080.
I feel his pain. I have a 1440p 144Hz. I upgraded my PC last year, all except my GPU, because the 2000 series sucked and AMD wasn't putting out much. Waited on the 3000/6000s, and now it's just a crapshoot. Probably be another year or two. Fuck it, I switched to spending most of my time fishing and playing older games. Right now I am running through the Baldur's Gate series.
 
The Intel and AMD tribalism is fucking weird.

That being said, I don't really bother with companies that take a long ass time to catch up.
 
This will sell like crazy, because good luck finding an AMD CPU right now.

You mean Intel 10th gen, because no one in their right mind would buy an 11th-gen CPU at the prices they ask. What a big lol for the 11900K price when it doesn't even beat the last-gen 10900K in gaming, let alone in productivity tasks.
 
You mean Intel 10th gen, because no one in their right mind would buy an 11th-gen CPU at the prices they ask. What a big lol for the 11900K price when it doesn't even beat the last-gen 10900K in gaming, let alone in productivity tasks.
Yes, that too.
 
Sure, you can say that if you compare 8c vs 8c and ignore power consumption, but the reality is AMD can offer 16c at similar power to Intel's 8c, and in thread-heavy workloads it totally dominates.

On top of that, when it comes to HEDT there is only one player at the moment, which is AMD, and they haven't released a Zen 3-based Threadripper yet either.

Alder Lake is supposed to have up to 20% better single-thread performance than Rocket Lake. Considering Rocket Lake is already behind Zen 3, and AMD looks to be doing a Zen 3 refresh, Alder Lake may only be 10% faster than Zen 3+ at best. And if the 29% IPC gain for Zen 4 is true (actual performance may beat that, since it might clock higher), Alder Lake will be miles behind in all metrics.

So Intel 14nm does not really compete at all.
AMD actually wipes the floor with Intel's offerings, but note that:

TSMC 7nm: transistor size (L1 cache) 22nm x 22nm
Intel 14nm: transistor size (L1 cache) 24nm x 24nm
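Taking the quoted cell dimensions at face value, the area difference between the two processes is easy to compute (a sketch using only the numbers in the post above):

```python
# Area ratio of the quoted L1-cache cell dimensions (figures from the post).
tsmc_7nm_side = 22    # nm, as quoted above
intel_14nm_side = 24  # nm, as quoted above

area_ratio = intel_14nm_side**2 / tsmc_7nm_side**2
print(f"Intel's 14nm cell is {area_ratio - 1:.0%} larger in area")  # ~19%
```

Which is the point being made: by this metric the density gap is far smaller than the "7nm vs 14nm" marketing names suggest.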
You guys are completely missing my point here. I know 14nm is dated tech, and so does Intel. That's why my original statement was that the article is clickbait. BTW, a smaller nm node does not automatically equal better. I also said to wait for Alder Lake, which will be 10nm.

Now, if Intel had access to TSMC 7nm right now like AMD, I seriously doubt they would be behind. One only has to look at the GPU market and compare Nvidia's RTX 3000, which uses 8nm, vs AMD's Big Navi at 7nm to defend my point.
 
One only has to look at the GPU market and compare Nvidia's RTX 3000, which uses 8nm, vs AMD's Big Navi at 7nm.
It's funny that you refer to Lisa Su kicking Huang's butt into "drop a tier" (3080 => 3070, 3080 Ti => 3080, etc.; that is why the memory configs are so weird, btw) as some sort of "fail" by AMD.

AMD's GPU chips beat NV's transistor for transistor, so the node point is moot. All it translates into is a power consumption difference. Oh, and they do it while using slower RAM.

Oh, and that is just mere 2 years after embarrassment called Vega (goodbye, Raja, I won't miss you)
 
AMD is using 7nm and just barely beating Intel's 14nm. Think about it...
I've seen this argument over and over, but the only reason they are close is because Intel doesn't give a shit about thermals.

Not only does AMD beat Intel's performance, it does so without reaching near 300W (which is the power consumption you would expect of a server CPU like Epyc).

If AMD were as desperate, they could juice up the clocks and mop the floor with Intel, but do we really want to normalize those insane thermals on desktop?
 
You guys are completely missing my point here. I know 14nm is dated tech, and so does Intel. That's why my original statement was that the article is clickbait. BTW, a smaller nm node does not automatically equal better. I also said to wait for Alder Lake, which will be 10nm.

Now, if Intel had access to TSMC 7nm right now like AMD, I seriously doubt they would be behind. One only has to look at the GPU market and compare Nvidia's RTX 3000, which uses 8nm, vs AMD's Big Navi at 7nm to defend my point.
Their 10 nm SF+ process is roughly comparable to TSMC's 7 nm process. And Zen 3 (in its mobile form) is perfectly competitive with Tiger Lake, which is on 10 nm.
 
You guys are completely missing my point here. I know 14nm is dated tech, and so does Intel. That's why my original statement was that the article is clickbait. BTW, a smaller nm node does not automatically equal better. I also said to wait for Alder Lake, which will be 10nm.

Now, if Intel had access to TSMC 7nm right now like AMD, I seriously doubt they would be behind. One only has to look at the GPU market and compare Nvidia's RTX 3000, which uses 8nm, vs AMD's Big Navi at 7nm to defend my point.

Alder Lake will be slower than the 5950X in productivity workloads, and probably the 5900X as well. With the big.LITTLE design there is every possibility it will be fucked by the Windows scheduler too, so while I am sure it will be an improvement over Rocket Lake when only the big cores are in use, I am really sceptical about how well it will handle moving workloads to the correct core type at the correct time.
 