
Intel: future CPUs to be slower but more efficient

Blizzard

Banned
Intel: "We can sit on our asses, cause really who is gonna catch up with us?"
It's been suggested earlier in the thread that Intel spends $10 billion annually on research, and that the laws of physics are actually causing issues here.

I don't think it's fair to say they're "sitting on their asses" if those are true.
 

TheSeks

Blinded by the luminous glory that is David Bowie's physical manifestation.
Your 2600K or 5820K might last a very long time =/

...Not seeing the problem here. From a consumer perspective this is great for saving money in the long run. The only people that'd be disappointed are the ones that want to throw money out the window to render 16K movies or some shit insanity like that.
 

xenist

Member
My overclocked 2600K is still hilariously overpowered for games. I really see myself going for a full decade with it.
 

clem84

Gold Member
Then I guess it's a good time to buy a new rig. A CPU bought right now will most likely remain competitive for a good 10-15 years.
 

DonMigs85

Member
Perhaps they also mean they'll become wider but initially with relatively low clockspeeds. Kinda like HBM versus GDDR5
 

BAW

Banned
I am clueless about PC gaming, but all the 2500k/2600k replies puzzle me. Were the next generations just minor % improvements or something?
 

wildfire

Banned
Maybe then AMD can catch up.

Sure, but even with AMD's promise of at least a 40% IPC improvement over their previous architecture, they still won't reach parity for another 2-2.5 years.


I am clueless about PC gaming, but all the 2500k/2600k replies puzzle me. Were the next generations just minor % improvements or something?



Roughly 15% IPC each year, and then you have to factor in that reviews of low-overhead APIs like Mantle and the new DirectX 12 show bigger performance gains for older Intel CPUs and current AMD CPUs than for the newest Intel ones, which means the overall gap is going to be closer again for the next year or two. Software really hasn't done much to make Intel's newest chips a must-buy.
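For anyone wondering how that "roughly 15% a year" stacks up, here's a quick back-of-the-envelope sketch; the 15% figure is just the rough estimate above applied mechanically, not a measured benchmark.

```python
# Back-of-the-envelope: how ~15% IPC per generation compounds over time.
# The 15% figure is just the rough estimate above, not an official number.
ipc_gain_per_gen = 0.15

relative_ipc = 1.0  # Sandy Bridge (2500K/2600K) as the baseline
for gen, name in enumerate(["Ivy Bridge", "Haswell", "Broadwell", "Skylake"], start=1):
    relative_ipc *= 1 + ipc_gain_per_gen
    print(f"{name}: ~{relative_ipc:.2f}x baseline IPC ({gen} generation(s) later)")

# Roughly 1.75x after four generations if 15% held every year; noticeable,
# but nowhere near the old doubling-every-couple-of-years pace.
```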
 
I am clueless about PC gaming, but all the 2500k/2600k replies puzzle me. Were the next generations just minor % improvements or something?

The minor CPU-side improvements do stack up somewhat, but most of the improvements went to the processor graphics (integrated graphics), platform power, and idle power usage. Mostly things that benefit laptop and tablet designs, yes, but today's desktops should also run very efficiently.

Though, the 2500K and 2600K are still more than fast enough for everything, so...
 
Well, with computer sales on the decline (except for Apple), this is probably a smart decision for Intel to make to compete with ARM. I think at some point they will go back; they still have enterprise processors to make besides consumer processors.
It'll be a while before they are fully competitive with ARM in terms of efficiency. The market wants compact, efficient processors for their mobile lives, but even Intel's Atom series and "efficient" Core M series are still vastly bigger energy hogs than their ARM cousins.
 

GHG

Gold Member
Yeh I feel like I'll be good with my 3570k for a while. Nothing has pushed it to the point where I feel like I have to upgrade it.

My 980ti on the other hand...
 

Fredrik

Member
This is depressing if true. Can't they go for more cores, or combine several CPUs on the same motherboard, or something like that? I never thought I'd see the day when computer tech stopped evolving toward better performance. How do we go forward now? Better programming? Faster memory?
 

TAJ

Darkness cannot drive out darkness; only light can do that. Hate cannot drive out hate; only love can do that.
My 4790K will last me years, lol

I have one. It's not enough for some games that have terrible multi-threading performance unless you win the jackpot in the overclocking-potential lottery or don't care about 60fps.
 
Nothing unexpected. This is what coming to the end of a computing paradigm is meant to look like. Let's see where we are in a decade though.
 

Some Nobody

Junior Member
Since games pretty much use consoles as a base for visuals, maybe by the time they've figured out a way to make faster CPUs we'll have learned how to pull the best visuals out of the PC CPUs we have now, given that today's PC specs are tomorrow's consoles.

But I'm firmly in favor of increasing efficiency, so let's get on that for a little while.
 
They are about to begin ramping 10nm technology and are testing 7nm technology. They are constantly innovating.

Plus, their chips will become programmable in a few years thanks to the purchase of Altera.
 
This is depressing if true. Can't they go for more cores, or combine several CPUs on the same motherboard, or something like that? I never thought I'd see the day when computer tech stopped evolving toward better performance. How do we go forward now? Better programming? Faster memory?

They already do the more cores/CPUs thing in servers and supercomputers. Higher-efficiency chips allow Intel to fit more cores in the same power and size envelopes, which means higher performance for applications that can take advantage of the extra cores. There's still plenty of performance evolution going on, but it's not being aimed at single-threaded IPC due to the laws of physics getting in the way.

The performance issue arises with applications that don't parallelize well, which to my understanding includes gaming. While there is still a lot game engines can do to improve multithreading performance (DX12 helps here), it's still going to be a struggle to get huge leaps in CPU performance for gaming until the likes of Intel defy physics or move on to things like quantum computing.
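To put rough numbers on why parallelization only gets you so far, here's a minimal Amdahl's law sketch; the 30% serial fraction below is an assumed example value, not a measurement of any real game engine.

```python
# Minimal Amdahl's law sketch: speedup from N cores when a fixed fraction
# of the work (serial_fraction) cannot be parallelized.
# The 0.3 serial fraction is an assumed example, not a measured game workload.
def amdahl_speedup(serial_fraction: float, cores: int) -> float:
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

for cores in (2, 4, 8, 16, 64):
    print(f"{cores:>2} cores -> {amdahl_speedup(0.3, cores):.2f}x speedup")

# The curve flattens toward 1 / 0.3 = ~3.3x no matter how many cores you add,
# which is why single-threaded IPC still matters so much for games.
```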
 

Trojita

Rapid Response Threadmaker
Gaming PCs are crazy environmentally unfriendly as it is. It shouldn't come as a surprise if governments around the world eventually limit the maximum power consumption of CPUs and GPUs. As our resources dwindle, they have already started to impose similar restrictions on all sorts of devices.

Yeah let's turn off all those server farms while we are at it /s
 

ZoddGutts

Member
Sucks for future PS4/Xbone emulators I suppose, though thankfully it looks like the last-gen systems (360/PS3/3DS) can be handled by current CPUs.
 

Overside

Banned
Everybody read the above, as it's the most important sentence that will be formed in this thread.

Until the next Einstein comes along to either resolve the Heat Problem or dramatically decrease the price of modern computer components (which will happen naturally... over decades' worth of time), there really is nothing that Intel or anyone else can do to significantly push processor speeds beyond today's top-end hardware. Well, not without things literally erupting in flame.

By the way, if any of you do have any solutions to micro- and nanoscopic heat management beyond "make it bigger" (that's not a solution), then you're the richest man walking the Earth and you should go patent that shit right now. And no, an ice cube on the CPU will not help.
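Rough numbers on why that's not hyperbole: dynamic CPU power scales roughly with capacitance x voltage squared x frequency, and higher clocks usually need higher voltage too. The scaling factors below are purely illustrative assumptions, not real chip figures.

```python
# Sketch of the heat problem: dynamic power scales roughly as P ~ C * V^2 * f,
# and pushing clocks higher usually requires raising voltage as well,
# so power (and heat) grows much faster than frequency does.
# The scaling factors here are illustrative assumptions, not real chip data.
def relative_power(freq_scale: float, voltage_scale: float) -> float:
    return voltage_scale ** 2 * freq_scale

# Example: chasing 30% higher clocks at the cost of ~15% more voltage
print(f"{relative_power(1.3, 1.15):.2f}x the power for 1.3x the clock speed")
# ~1.72x the heat to dissipate from the same tiny slab of silicon.
```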

You're not thinking big enough. Make it so big that we can comfortably fit inside. Then, when you're living inside your processor (which is also fully furnished), the access terminals will only require as much space as functionally desired.

Problem solved. House-puters.

LAN involves throwing a string with a can on either end into the neighbor's window for voice chat.
 

dark10x

Digital Foundry pixel pusher
My 2500K at 4.6GHz will never have to be replaced, will it?
I just upgraded from a 3570k to a 5820k and saw massive gains. I think the 2500k is adequate but outdated. My workflow has improved greatly with the new CPU and a lot of the slowdown I was getting in modern games has cleared up without changing my GPU.

Well... do we really need more processing power for playing games?
Not necessarily but there are many other tasks that benefit greatly from increased CPU power. I spend a lot of time producing video content and any amount of CPU you can throw at it has a beneficial effect.
 

Sesuadra

Unconfirmed Member
So my 3770k at stock speed will be good for a few years as long as I replace my GTX780 in the future...

Makes me happy because I'll save money and sad because I love buying a new computer
 

Siphorus

Member
(This is going to sound kind of bitter, but it's just my $0.02.)

I get why you guys are happy that you will be able to keep your CPUs for a long time, but this has far-reaching effects beyond gaming. Unless there is a material/approach switch (be it the higher clocks of graphene, the smaller density of carbon nanotubes, or something even more exotic like optical/quantum computing), there may be a lot of stalls in the current progression of technology in many fields. Computing affects many other industries, and artificial intelligence in particular (well, the hunt for true artificial intelligence) will suffer, even if temporarily, from a delay. On the other hand, there is a much-needed switch off of silicon coming, and the only real thing that may satiate the hunger for more powerful CPUs before that switch is leaning further on parallelism, which runs into Amdahl's law (and we are already close to that limit for many workloads). So in many ways this is bad, but it also has the potential for massive breakthroughs down the road.

However, I do think Intel is calling it quits rather early, as there have been successful 1nm transistors, although the yields on those will be beyond bad. So here's hoping for a materials/approach switch in the near future, since at the moment this is rather bad for the progression of technology as a whole. In this Intel-based future you could just power up more low-power machines, but that is no substitute for one power-hungry machine that outperforms ten of those low-powered ones (and which you can then get ten of), and the more computers you add to supercomputing clusters, the more the network, coordination, messaging, and parallel-scaling limits come into play.
 

Overside

Banned
(This is going to sound kind of bitter, but it's just my $0.02.)

I get why you guys are happy that you will be able to keep your CPUs for a long time, but this has far-reaching effects beyond gaming. Unless there is a material/approach switch (be it the higher clocks of graphene, the smaller density of carbon nanotubes, or something even more exotic like optical/quantum computing), there may be a lot of stalls in the current progression of technology in many fields. Computing affects many other industries, and artificial intelligence in particular (well, the hunt for true artificial intelligence) will suffer, even if temporarily, from a delay. On the other hand, there is a much-needed switch off of silicon coming, and the only real thing that may satiate the hunger for more powerful CPUs before that switch is leaning further on parallelism, which runs into Amdahl's law (and we are already close to that limit for many workloads). So in many ways this is bad, but it also has the potential for massive breakthroughs down the road.

However, I do think Intel is calling it quits rather early, as there have been successful 1nm transistors, although the yields on those will be beyond bad. So here's hoping for a materials/approach switch in the near future, since at the moment this is rather bad for the progression of technology as a whole. In this Intel-based future you could just power up more low-power machines, but that is no substitute for one power-hungry machine that outperforms ten of those low-powered ones (and which you can then get ten of), and the more computers you add to supercomputing clusters, the more the network, coordination, messaging, and parallel-scaling limits come into play.


You need to work on your bitter, bro.

Would it help if I said true AI was impossible?
 
Well, anyone who's just played "The Division" beta knows the time when games stretch your top-end, overclocked CPU to the limit... has already arrived.
 

Siphorus

Member
You need to work on your bitter, bro.

Would it help if I said true AI was impossible?

I don't know enough about artificial intelligence or the approaches to it (neural networks, deep learning) to say for sure; I've just brushed up on the topic over time. However, I think that on some level some semblance of intelligence is possible. My thinking basically boils down to this: given enough bits, you can simulate a synapse, a neuron, etc. Why would you not be able to simulate the brain given enough computing power? True intelligence, or a computer that is able to reason beyond our capacity, may be a stretch (though at its core such a machine would be inherently better even at a human level, since it remembers precisely and doesn't fatigue or lose focus). But I don't see how intelligence on some level is impossible for a computer. Given enough bits. (Which is a big given in and of itself.)
 

orochi91

Member
I have one. It's not enough for some games that have terrible multi-threading performance unless you win the jackpot in the overclocking-potential lottery or don't care about 60fps.

I'm currently at 4.7GHz.

Did I win the jackpot?

D:
 

Chumpion

Member
This is Intel's idea of foreplay as they prepare to shove an ARM-sized dildo up our collective ass.

Where's my goddamn 8-core laptop CPU, Intel? You gormless bastards.
 

BigTnaples

Todd Howard's Secret GAF Account
So is the future dead?

I mean. Where do we go next?



Surely this is not the pinnacle of computing.
 

PantsuJo

Member
The future could be ARM CPUs, imo.
Even a fork of Windows 10 desktop is in development for ARM systems.

But it's a long, long road.
 