
Intel: future CPUs to be slower but more efficient

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
Moore's Law is about transistor density and cost, it does not govern how those transistors are used in processor design (performance vs. power).
I thought I was the only one who saw that the article's claim didn't match the article's content.

That said, Moore's Law is coming to a halt with silicon.
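As a back-of-the-envelope sketch of what the density side of the law actually says (the starting count and the two-year doubling period below are illustrative assumptions, not figures from the article):

Code:
# Rough sketch of the density side of Moore's Law: transistor counts
# doubling roughly every two years. Base count and period are assumptions.
def projected_transistors(base_count, base_year, target_year, doubling_years=2.0):
    doublings = (target_year - base_year) / doubling_years
    return base_count * 2 ** doublings

for year in (2006, 2011, 2016):
    print(year, f"{projected_transistors(3e8, 2006, year):.1e}")
# 2006 3.0e+08, 2011 1.7e+09, 2016 9.6e+09 -- and the law says nothing about
# whether those transistors go into clocks, cores or power savings.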
 

kyser73

Member
This is depressing if true. Can't they go for more cores, or combine several CPUs on the same motherboard, or something like that? I never thought I'd see the day when computer tech stopped evolving toward better performance. How do we go forward now? Better programming? Faster memory?

It is still evolving, we're just looking at the end of one very long branch.

Optical electronics, using photons rather than electrons, are finally showing signs of being viable outside the lab, with at least 3 companies claiming to be able to create optronic chips on silicon using existing lithographic processes.

There are also potential materials innovations that can deal with the heat issue.

Just because an industry giant says something is happening doesn't mean it is, either. Intel has been completely incapable of penetrating the mobile market, and has a business that is built around a specific production & design process.

Changing that isn't easy: it's expensive, risky, and has to happen in an environment where financial reporting & board action are driven by the quarter, not one given to changes in direction that can take a decade or more.

TL;DR: it isn't depressing, and it will likely force innovation in other areas.
 

squidyj

Member
So is the future dead?

I mean. Where do we go next?



Surely this is not the pinnacle of computing.

People are checking out graphene, but it's expensive to make a chip out of, and it might be difficult to get it to do logic too.

If we can get everything to work out it can be massively faster than silicon, but I wouldn't expect to see products any time soon.
 

Overside

Banned
I don't know enough about artificial intelligence or the approaches to it (neural networks, deep learning) to say for sure; I've just brushed up on the topic over time. However, I think that on some level, some semblance of intelligence is possible. My thinking basically boils down to this: given enough bits, you can simulate a synapse, a neuron, etc. Why would you not be able to simulate the brain given enough computing power? True intelligence, or a computer able to reason beyond our capacity, may be a stretch (though at its core it would be inherently better even at a human's level, because it remembers precisely and doesn't fatigue or lose focus). But I don't see how intelligence on some level is impossible for a computer, given enough bits. (Which is a big given in and of itself.)
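For what "simulate a neuron" means at the crudest level, here's a minimal sketch: a weighted sum of inputs pushed through a threshold. Real neurons are vastly more complicated, and the inputs and weights here are made up purely for illustration.

Code:
# Minimal artificial neuron: weighted sum of inputs plus a firing threshold.
# Inputs/weights are made-up illustration, not a model of real biology.
def neuron(inputs, weights, threshold=0.0):
    activation = sum(i * w for i, w in zip(inputs, weights))
    return 1 if activation > threshold else 0

print(neuron([1.0, 0.0], [0.8, -0.5]))                 # fires: 0.8 > 0
print(neuron([1.0, 1.0], [0.8, -0.5], threshold=0.5))  # silent: 0.3 < 0.5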


Your bitter is still awful. That's not bitter at all. It's pleasant. All wrong.

The main problem right now is that, despite all the knowledge we have on the (brain) matter... we really have no complete idea of how a brain works... So we don't know how to make or simulate an artificial one, no matter how much processing power we have...


So on the bright side, even if processing power stagnates for a bit... it's not what's hindering AI right now.
 

KKRT00

Member
My 2500K at 4.6GHz will never have to be replaced, will it?

I don't understand those posts. Are you not playing demanding games on your PC? The 2500K is already showing its age in games.
8 threads are the minimum for high-performance gaming right now. I could understand a 2600K@4.6, but the 2500K is getting really old.

---
I just upgraded from a 3570k to a 5820k and saw massive gains. I think the 2500k is adequate but outdated. My workflow has improved greatly with the new CPU and a lot of the slowdown I was getting in modern games has cleared up without changing my GPU.
Yep.
 

jeremiahg

Neo Member
It sounds like some are misinterpreting Intel's statement as an indication that computer speed will not improve. My interpretation is that conventional processor technology has reached its peak. So Intel has to go back to the drawing board to explore fundamentally new technologies that will take a while to catch up to the old ones, but will eventually surpass them. Moore is just taking a break until they mature.
 

Skittles

Member
If we ever hit 5nm, I'm pretty much going to need tweezers to install the cpu :p

If they want faster speeds, shouldn't they focus on heat dissipation?
 

Renekton

Member
It sounds like some are misinterpreting Intel's statement as an indication that computer speed will not improve. My interpretation is that conventional processor technology has reached its peak. So Intel has to go back to the drawing board to explore fundamentally new technologies that will take a while to catch up to the old ones, but will eventually surpass them. Moore is just taking a break until they mature.
At least some of us interpret it this way:

Computing power will still improve, but this may require stacking more chips to take advantage of their better power efficiency, rather than counting on forever increasing IPC+clock per core.
 

Overside

Banned
At least some of us interpret it this way:

Computing power will still improve, but this may require stacking more chips to take advantage of their better power efficiency, rather than counting on forever increasing IPC+clock per core.

You have encountered a Wild Amdahl.
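For anyone who hasn't met it: Amdahl's law caps the speedup you get from throwing cores (or stacked chips) at a problem by whatever fraction of the work stays serial. Quick sketch with a purely illustrative 75%-parallel workload:

Code:
# Amdahl's law: overall speedup is limited by the serial fraction of the work.
# The 75% parallel fraction here is purely illustrative.
def amdahl_speedup(parallel_fraction, cores):
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

for n in (2, 4, 8, 16, 1000):
    print(f"{n:>4} cores: {amdahl_speedup(0.75, n):.2f}x")
# Even 1000 cores top out just under 4x when a quarter of the work is serial.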
 
I just upgraded from a 3570k to a 5820k and saw massive gains. I think the 2500k is adequate but outdated. My workflow has improved greatly with the new CPU and a lot of the slowdown I was getting in modern games has cleared up without changing my GPU.

Hard to say whether your improved gaming performance is due to CPU or something else. At the same time, you changed your motherboard, RAM and possibly did a new system install. Without having a clear A/B comparison and benchmarks, your perceived improvement could thus be due to other factors or just be pure placebo effect.
 
Sadly, that's how I see it. As long as Intel is on top of AMD, Moore's law won't advance without incentives.

No. Intel puts an enormous amount of time, money, and effort into R&D, far more than AMD could ever hope to match.

As has already been stated, we're already approaching the physical limitations of silicon chips. There's a bit more room to go, but the closer we get, the harder it gets to progress any further. It's going to be significantly harder and more expensive to eke out any more appreciable gains, so Intel is redirecting their efforts towards efficiency as they spend more of their time and money on new technologies rather than wringing silicon dry. They are innovating, just in areas that are going to take longer to pay off, but that's necessary in the long run.
 

Kezen

Banned
I don't think performance will stall technically speaking, but the focus clearly is on power efficiency. In pure gaming workloads modern CPUs already do well, and DX12 will only improve that so no worries.

Hard to say whether your improved gaming performance is due to CPU or something else. At the same time, you changed your motherboard, RAM and possibly did a new system install. Without having a clear A/B comparison and benchmarks, your perceived improvement could thus be due to other factors or just be pure placebo effect.
Look up various CPU benches in AAA games and you'll see what he is talking about.
 

Dryk

Member
I've never felt held back by my stock 2500K. Guess I'll stick with GPU upgrades as long as my motherboard and RAM hold out.
 

SRG01

Member
Aren't they trying to figure out something else out of which to build processors?

I have some exposure to the industry, and suffice it to say, CMOS (on silicon) is the elephant no one can dislodge.

To give an analogy: there was speculation that the industry was going to move on to X-ray lithography over a decade ago because foundries were hitting the limits of UV. And they're still on UV, because the cost of retooling (and the special materials involved) is just too high.

Moving away from CMOS would take so much upfront capital that no one would be willing to take the risk.
 
Look up various CPU benches in AAA games and you'll see what he is talking about.

No, I don't. Have a look here (sorry for the German website, but they do run some of the most comprehensive CPU tests, and the bar graphs are understandable without German):

http://www.computerbase.de/2014-08/...0x-haswell-e-test/5/#abschnitt_spiele_full_hd

The CPU's influence on FPS at high resolutions has changed very little since the 2500K. And even at low, purely CPU-constrained resolutions, the difference is only 25%.

If one wants to make an argument about new CPUs improving gaming performance (especially considering most of these older i5/i7s run at 4 GHz+), one would have to look at stuff like microstutters. Because clearly, on average, the influence of the CPU on FPS is very low since the 2500k@4GHz came around.
 
Your motherboard will eventually die. Then you'll see that there's no replacement motherboard for that socket.

You will have to upgrade.

Had it happen to me recently; managed to pick up a decent Gigabyte UD7 for my 2700k, got it for a good price off eBay too. Give it another year and you'll be fucked though, at least for good motherboards.
 

Kudo

Member
Good, maybe my i7-6700k will now last at least the 5 years I planned it to last when I bought this computer.
Hopefully games start to utilize multiple cores more in the future. Might get Pascal for this machine; it's perfect for 1440p gaming.
 

DonMigs85

Member
No, I don't. Have a look here (sorry for the German website, but they do run some of the most comprehensive CPU tests, and the bar graphs are understandable without German):

http://www.computerbase.de/2014-08/...0x-haswell-e-test/5/#abschnitt_spiele_full_hd

The CPU's influence on FPS at high resolutions has changed very little since the 2500K. And even at low, purely CPU-constrained resolutions, the difference is only 25%.

If one wants to make an argument about new CPUs improving gaming performance (especially considering most of these older i5/i7s run at 4 GHz+), one would have to look at stuff like microstutters. Because clearly, on average, the influence of the CPU on FPS is very low since the 2500k@4GHz came around.

I currently have a GTX 960 paired with an ancient Core 2 Quad Q9550. If I upgraded to at least a Core i5-4590 or 6500, I wouldn't ever be CPU-limited again in the vast majority of games, right? I only game at 1080p.
 
Very interesting, in a way it'll be nice to not feel upgrade pressure, but on the other hand it's nice to upgrade! ;). What games out there currently saturate all 8 threads of an i7?
 

Kezen

Banned
No, I don't. Have a look here (sorry for the German website, but they do run some of the most comprehensive CPU tests, and the bar graphs are understandable without German):

http://www.computerbase.de/2014-08/...0x-haswell-e-test/5/#abschnitt_spiele_full_hd

The CPU's influence on FPS at high resolutions has changed very little since the 2500K. And even at low, purely CPU-constrained resolutions, the difference is only 25%.

If one wants to make an argument about new CPUs improving gaming performance (especially considering most of these older i5/i7s run at 4 GHz+), one would have to look at stuff like microstutters. Because clearly, on average, the influence of the CPU on FPS is very low since the 2500k@4GHz came around.

The CPU does matter a lot at 1080p. The 2500K runs at 3.3 GHz stock, if memory serves me right.
[gamegpu.ru CPU benchmark chart: Tom Clancy's The Division beta]

The Division is no exception, feel free to educate yourself.

The 2500K can easily be left in the wake of a 6-core CPU. The upgrade is warranted, even if you can OC the Sandy chip to 4+ GHz.
[gamegpu.ru CPU benchmark chart: Just Cause 3]

[gamegpu.ru CPU benchmark chart: Assassin's Creed Syndicate]
 

DonMigs85

Member
Had it happen to me recently; managed to pick up a decent Gigabyte UD7 for my 2700k, got it for a good price off eBay too. Give it another year and you'll be fucked though, at least for good motherboards.

Even today you can still buy new LGA775 motherboards, but they're usually cheap combo types that support both DDR2 and DDR3.
 
The CPU does matter a lot at 1080p. The 2500K runs at 3.3 GHz stock, if memory serves me right.

Personal gripe: I hate that these tests never include my CPU, the 5820K. I can see how big a difference it is from my previous FX-8350 by looking at similar CPUs, though.
 
The CPU does matter a lot at 1080p. The 2500K runs at 3.3 GHz stock, if memory serves me right.
[gamegpu.ru CPU benchmark chart: Tom Clancy's The Division beta]

The Division is no exception, feel free to educate yourself.

The 2500K can easily be left in the wake of a 6-core CPU. The upgrade is warranted, even if you can OC the Sandy chip to 4+ GHz.
[gamegpu.ru CPU benchmark chart: Just Cause 3]


[gamegpu.ru CPU benchmark chart: Assassin's Creed Syndicate]

These are just some of the very few examples where more than 4 cores matter, and hyperthreading is actually effective. See how high up the ancient 2600K is. Might become more important in the future, given that the consoles have more than 4 cores.

Furthermore, the whole point of the K line of CPUs is a free multiplier and hence effortless overclocking.

I feel I don't particularly need more education on the subject, thank you, but I'm always open to new evidence and will change my mind accordingly.

Edit: this is from the POV of a 60 FPS gamer. I can see that things might look different for people on 120+ Hz screens.
 
That is just one of the very few examples where more than 4 cores matter, and hyperthreading is actually effective. See how high up the ancient 2600K is. Might become more important in the future, given that the consoles have more than 4 cores.

Furthermore, the whole point of the K line of CPUs is a free multiplier and hence effortless overclocking.

I feel I don't particularly need more education on the subject, thank you, but I'm always open to new evidence and will change my mind accordingly.

His point was about how aged the 2500K is. He (EDIT: might have been somebody else who directly said this, but he was still showing the 2500K) even said you could get by with a 2600K nowadays.
 

Kezen

Banned
Personal gripe: I hate that these tests never include my CPU, the 5820K. I can see how big a difference it is from my previous FX-8350 by looking at similar CPUs, though.
It should be a substantial upgrade. The 5820K should do better than the 8-thread i7s.

We are at a point where >4 threads brings very tangible improvements. The 2500K is not "outdated", but it is falling behind in many games; it remains true, though, that overclocking it is rather easy.

That is just one of the very few examples where more than 4 cores matter, and hyperthreading is actually effective. See how high up the ancient 2600K is. Might become more important in the future, given that the consoles have more than 4 cores.

Furthermore, the whole point of the K line of CPUs is a free multiplier and hence effortless overclocking.

I feel I don't particularly need more education on the subject, thank you, but I'm always open to new evidence and will change my mind accordingly.

You can educate yourself by doing more research on CPU usage in modern games; you will see that the games I've quoted are not the exception. My point was merely that the 2500k is falling behind and that 4 threads is starting to become limiting. Just look at the lower-clocked CPUs doing better, partly because of their core count.

Fair point regarding overclocking, as I've noted in my previous post, but that applies equally to 6+ core CPUs. The gap only widens then.

Onionpowder is absolutely correct: a 6-core CPU such as his will definitely improve performance in a wide range of AAA games.
 

mrklaw

MrArseFace
The point surely is that gamers are irrelevant compared to the mass market. Current chips are just fine; it's the GPU that needs to push on.

In the general home-use scenario of Facebook/Chrome/office apps, current CPU performance is fine, and SSDs help in making the computer generally more responsive. Ultrabooks have become more common, and even normal fat laptops are using ultrabook CPUs so that they can either offer longer battery life, or the budget-conscious manufacturer can put in a smaller battery for the same life.

They've been treading water for a while now.
 

DonMigs85

Member
I wish Intel would also just produce CPUs with higher core counts and no integrated graphics. I know some people who got Haswell Xeons at good prices for that reason.
 

Kudo

Member
So, what'll be the "last" CPU worth upgrading to until Intel figures this shit out?

I'm on a 3570k now.

I'd assume Skylake or Haswell-E; afaik Skylake has better single-core performance, but Haswell-E has 6 cores compared to Skylake's 4.

I don't think you have any reason to upgrade from 3570k though.
 

Zojirushi

Member
I'd assume Skylake or Haswell-E; afaik Skylake has better single-core performance, but Haswell-E has 6 cores compared to Skylake's 4.

I don't think you have any reason to upgrade from 3570k though.

Well people ARE being bottlenecked by their 2500k if they rock high end GPUs already and that'll only get worse over time, so I'm guessing the same goes for a 3570k.
 

Kudo

Member
Well people ARE being bottlenecked by their 2500k if they rock high end GPUs already and that'll only get worse over time, so I'm guessing the same goes for a 3570k.

True enough, spoke out of line there. The 3570k should still be good for 1080p gaming and maybe more if overclocked.
If you have a high-end GPU and plan to play at 4K/60+ fps it's not a bad idea to have a high-end CPU too; being bottlenecked by the CPU is the worst, as that's the hardest part to change.
 

dark10x

Digital Foundry pixel pusher
Hard to say whether your improved gaming performance is due to CPU or something else. At the same time, you changed your motherboard, RAM and possibly did a new system install. Without having a clear A/B comparison and benchmarks, your perceived improvement could thus be due to other factors or just be pure placebo effect.
I know it's not a placebo as I was in the middle of testing Rise of the Tomb Raider on the PC when I switched over. My Windows 10 install on the 3570k was only a few months old anyways.

The village sequence always maxed out all four cores on the older CPU and the frame-rate dropped like a stone. Now, it's just GPU limited but still runs much better.
 
True enough, spoke out of line there. The 3570k should still be good for 1080p gaming and maybe more if overclocked.
If you have a high-end GPU and plan to play at 4K/60+ fps it's not a bad idea to have a high-end CPU too; being bottlenecked by the CPU is the worst, as that's the hardest part to change.

You have it the wrong way around: the higher the resolution, the less influence the CPU has. With modern GPUs, one can run into scenarios where the game runs CPU-limited at 1080p. However at 4K, any graphically intense game will be GPU-limited for a long time to come.
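A toy frame-time model makes the point (all the millisecond figures are made up for illustration): the frame rate is set by whichever of CPU or GPU is slower per frame, and only the GPU cost grows with pixel count.

Code:
# Toy model: frame rate is limited by the slower of CPU and GPU work per frame,
# and only the GPU cost scales with resolution. All numbers are made up.
def fps(cpu_ms, gpu_ms_at_1080p, width, height):
    gpu_ms = gpu_ms_at_1080p * (width * height) / (1920 * 1080)
    return 1000.0 / max(cpu_ms, gpu_ms)

print(f"1080p: {fps(12, 10, 1920, 1080):.0f} fps")  # CPU-limited, ~83 fps
print(f"4K:    {fps(12, 10, 3840, 2160):.0f} fps")  # GPU-limited, ~25 fps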
 
When are the new desktop processors out?
I've got a 3770k at 4.5 GHz, and I 'try' to game at 4K 60 Hz. I don't really want to upgrade as it would mean a new mobo etc. Is that chip realistically gonna keep me OK for a good while?
 

mrklaw

MrArseFace
I know it's not a placebo as I was in the middle of testing Rise of the Tomb Raider on the PC when I switched over. My Windows 10 install on the 3570k was only a few months old anyways.

The village sequence always maxed out all four cores on the older CPU and the frame-rate dropped like a stone. Now, it's just GPU limited but still runs much better.

I think an older i7 would stay relevant for longer, as the hyper-threading will help a ton.

But assuming most consumers would buy an i5 (4 cores, no hyper-threading), would a recent Broadwell/Skylake i5 significantly outperform an older 2500/3570?
 

DonMigs85

Member
I think an older i7 would stay relevant for longer, as the hyper-threading will help a ton.

But assuming most consumers would buy an i5 (4 cores, no hyper-threading), would a recent Broadwell/Skylake i5 significantly outperform an older 2500/3570?
Anandtech said that Skylake's IPC is on average 15-20% higher than Sandy Bridge's and about 7% higher than Haswell's. I think it would be best to wait till Kaby Lake at least. It'll include a lot of new AVX-512 instructions as well as SHA-1 acceleration, which may prove useful one day. Or when games actually start making heavy use of AVX2, that's another good time to upgrade from Sandy or Ivy Bridge.
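Rough back-of-the-envelope using that 15-20% figure, assuming per-core throughput scales roughly with IPC x clock (the clock speeds and the 17.5% midpoint below are illustrative assumptions, not benchmark results):

Code:
# Per-core throughput ~ IPC x clock. The 1.175 factor is the midpoint of the
# quoted 15-20% IPC gap; the clock speeds are illustrative assumptions.
def per_core(ipc, clock_ghz):
    return ipc * clock_ghz

sandy_oc = per_core(1.0, 4.6)    # Sandy Bridge-class chip overclocked to 4.6 GHz
skylake  = per_core(1.175, 3.9)  # Skylake-class chip around 3.9 GHz
print(f"Skylake stock vs Sandy @ 4.6 GHz: {skylake / sandy_oc:.2f}x")  # ~1.00x, roughly a wash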
 

Kudo

Member
You have it the wrong way around: the higher the resolution, the less influence the CPU has. With modern GPUs, one can run into scenarios where the game runs CPU-limited at 1080p. However at 4K, any graphically intense game will be GPU-limited for a long time to come.

I'd assume you still need significantly more juice when playing newer games that use multiple cores, if you're getting bottlenecked by CPU at 1080p and want to play 4k with smooth fps?
 

DonMigs85

Member
I'd assume you still need significantly more juice when playing newer games that use multiple cores, if you're getting bottlenecked by CPU at 1080p and want to play 4k with smooth fps?
Higher resolutions don't really make the CPU work any harder. 4K can cut your framerate to 1/3 or 1/4 of what it was at 1080p, so it's unlikely you'll ever be CPU-limited.
 

Nachtmaer

Member
I wish Intel would also just produce CPUs with higher core counts and no integrated graphics. I know some people who got Haswell Xeons at good prices for that reason.

Well, they've been doing that on the desktop for years now. Ever since LGA1366 (and now LGA2011-x), Intel has had a desktop version of their "smallest" server chips with 4 to 8 cores + HT. A fully enabled Broadwell-E will even have 10 cores. You really don't need to get a Xeon to have a higher-core-count CPU without integrated graphics. Unless you really need 28 cores.
 