
Intel: future CPUs to be slower but more efficient

Renekton

Member
Intel calls end to Moore's Law

Not only is Moore's Law coming to an end in practical terms, in that chip speeds can be expected to stall, but performance is actually likely to roll back, at least in the early years of semi-quantum-based chip production, with power consumption taking priority over what has been the fundamental impetus behind the development of computers for the last fifty years.

What does this have to do with gaming?
  • Your 2600K or 5820K might last a very long time =/
  • No point waiting for next CPU gen for your gaming rig unless you want future tech like Optane
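For context, the "fundamental impetus" the article is talking about is Moore's Law: transistor counts doubling roughly every two years. As a back-of-the-envelope formula (the two-year doubling period is the commonly quoted figure, not a law of nature):

```latex
% Transistor count after t years, assuming a ~2-year doubling period
N(t) \approx N_0 \cdot 2^{\,t/2}
```

At that rate a decade buys you about 2^5 = 32x the transistor budget; that's what goes away if the doubling stops.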
 

reKon

Banned
My 4690K will do quite fine. Plan on keeping the PC I built for the next 5+ years and just maintaining that. Will buy a 1 TB SSD when that hits $150 and buy a more powerful GPU when VR becomes more prevalent.
 

Renekton

Member
They don't have to innovate, they have a monopoly.
They are innovating... just not towards stuff that enthusiast gamers care about.

I.e. their profit margin from server chips (which value efficiency) is boss compared to desktop. However Google, their biggest customer, is trying to get into ARM for the power efficiency.
 

10k

Banned
My 3770k at 4.2GHz will last me until I quit PC gaming, I guess. Awesome. Upgraded from an i7-920, which might still be useful for today's games too lol.
 

_woLf

Member
Still running on my non-overclocked 3770K and I haven't had a single issue with performance where the CPU is to blame.
 
I find it hard to believe that. If AMD had comparable performance, Intel would find a way to improve perf.

They can already. The process of die shrinks is just getting too difficult and expensive right now. Comparatively, focusing on TDP and power usage is a much better use of resources and more easily attainable. They could continue to increase processing performance by leaps and bounds, but the costs would be so astronomical it would see no adoption.
 
I find it hard to believe that. If AMD had comparable performance, Intel would find a way to improve perf.
Intel's stuck on 14nm for CPUs. The theoretical limit with silicon is 5nm or 7nm (e: some say 3nm, I don't believe them), depending on who you ask.

But they're having a ton of trouble even getting to 10nm. Competition isn't going to fix this.

GPUs are hitting a similar wall at what, 28nm? The free upgrades couldn't last forever.
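For a sense of why there's a hard floor, here's a crude sanity check (using silicon's lattice constant of roughly 0.543 nm, a textbook value; note that modern node names are partly marketing and don't map cleanly to physical feature sizes):

```latex
% A literal 5 nm feature measured in silicon unit cells (a \approx 0.543 nm)
\frac{5\ \text{nm}}{0.543\ \text{nm}} \approx 9\ \text{unit cells}
```

At a handful of atoms across, electrons start tunneling straight through barriers, so leakage explodes long before you hit zero.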
 

Alvarez

Banned
In this case it's the laws of physics that are the problem.

Everybody read the above, as it's the most important sentence that will be formed in this thread.

Until the next Einstein comes along to either resolve the Heat Problem or dramatically decrease the price of modern computer components (which will happen naturally... over decades' worth of time), there really is nothing that Intel or anyone can do to significantly push processor speeds beyond top-top end hardware. --Well, not without things literally erupting in flame.

By the way, if any of you do have any solutions to micro and nanoscopic heat management beyond "make it bigger" (that's not a solution), then you're the richest man on Earth walking and you should go patent that shit right now. And no, an ice cube on the CPU will not help.
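For anyone who wants the physics behind the Heat Problem, the standard first-order model for CMOS switching power is (a textbook approximation, not anyone's internal numbers):

```latex
% Dynamic power: activity factor \alpha, switched capacitance C,
% supply voltage V, clock frequency f
P_{\text{dyn}} \approx \alpha\, C\, V^{2} f
```

Pushing f higher usually means raising V too, so power climbs roughly with the cube of clock speed in the worst case. That's why the lever left to pull is efficiency, not frequency.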
 

DBT85

Member
Did anyone else stop swapping CPUs yearly once the Q6600 arrived? Ever since then I'm on a 4 year cycle.
 

RedSwirl

Junior Member
Everybody read the above, as it's the most important sentence that will be formed in this thread.

Until the next Einstein comes along to either resolve the Heat Problem or dramatically decrease the price of modern computer components (which will happen naturally... over decades' worth of time), there really is nothing that Intel or anyone can do to significantly push processor speeds beyond top-top end hardware. --Well, not without things literally erupting in flame.

By the way, if any of you do have any solutions to micro and nanoscopic heat management beyond "make it bigger" (that's not a solution), then you're the richest man on Earth walking and you should go patent that shit right now. And no, an ice cube on the CPU will not help.

Aren't they trying to figure out something else out of which to build processors?
 

Gamerman1

Member
Friend of mine is still rocking a Q6600 with 8GB of DDR2 RAM. He has a GeForce 970 and an SSD also. No problems running any games at High/Max settings at 1080p.
 

SpotAnime

Member
My rig that I built in 2010 has an i7 950 and is still going strong.

So I would tend to agree with their statement. In fact I would think the trend is already underway.
 

Occam

Member
Gaming PCs are crazy environmentally unfriendly as it is. It shouldn't come as a surprise if governments around the world eventually limit the maximum power consumption of CPUs and GPUs. As our resources are dwindling, they have started to impose similar restrictions for all sorts of devices already.
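For scale, a quick back-of-the-envelope (the wattage and hours below are assumptions for illustration, not measurements of any real rig):

```python
# Rough annual energy use of a gaming PC.
# Both inputs are illustrative assumptions, not measured figures.
load_watts = 500       # assumed draw under gaming load
hours_per_day = 4      # assumed daily play time

kwh_per_year = load_watts * hours_per_day * 365 / 1000
print(f"~{kwh_per_year:.0f} kWh/year")  # prints "~730 kWh/year"
```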
 

Guess Who

Banned
They can already. The process of die shrinks is just getting too expensive right now.

They've still done die shrinks every other generation (until Kaby Lake). But as these things get smaller and smaller it's getting harder and harder to manufacture them at acceptable yields - the reason Broadwell took so long is because getting their 14nm process working was fucking difficult. Plus we're starting to hit the physical limits of how small you can make transistors. 5-7nm is about as small as die-shrinks can get you.
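The yield problem is often sketched with the classic Poisson model: the fraction of defect-free dies falls off exponentially with die area times defect density. A minimal sketch (the defect densities here are made-up illustrative values, not any fab's real numbers):

```python
import math

def poisson_yield(die_area_cm2: float, defects_per_cm2: float) -> float:
    """Fraction of good dies under the classic Poisson yield model."""
    return math.exp(-die_area_cm2 * defects_per_cm2)

# Same ~1.5 cm^2 die on a mature process vs. a brand-new one
print(f"{poisson_yield(1.5, 0.1):.0%}")  # ~86% usable dies
print(f"{poisson_yield(1.5, 1.0):.0%}")  # ~22% usable dies
```

That exponential is why a new node with even modestly worse defect density can crater yields, and why Broadwell slipped the way it did.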
 
Friend of mine is still rocking a Q6600 with 8GB of DDR2 RAM. He has a GeForce 970 and an SSD also. No problems running any games at High/Max settings at 1080p.

Speaking from actual experience with a very similar processor (Q9550), your friend's 970 is being seriously bottlenecked. He would benefit hugely from a CPU upgrade.
 

LOLCats

Banned
Damn this i7 3770k is lasting a long time.

My GTX 670... Not so much, seems like it got slower... Sorta kidding but not really.
 

Steel

Banned
Gaming PCs are crazy environmentally unfriendly as it is. It shouldn't come as a surprise if governments around the world eventually limit the maximum power consumption of CPUs and GPUs. As our resources are dwindling, they have started to impose similar restrictions for all sorts of devices already.

Joke post?
 
Gaming PCs are crazy environmentally unfriendly as it is. It shouldn't come as a surprise if governments around the world eventually limit the maximum power consumption of CPUs and GPUs. As our resources are dwindling, they have started to impose similar restrictions for all sorts of devices already.

Wii U will rise again from the ashes with its low power consumption.

Just wait.

[skeleton waiting image]
 
Aren't they trying to figure out something else out of which to build processors?

Graphene? Yeah, that's a bit of a ways off, though. It seems like we're really approaching the limits of what we can do with traditional x86 silicon processors.

IBM POWER mass adoption when?
 