
Intel: future CPUs to be slower but more efficient

onQ123

Member
Thanks to HSA, PCs & consoles will go back to having specialized co-processors, because now they can work together better.

Why have a beastly CPU running hot & using a lot of energy when you can have a specialized processor for the task that can do the job more efficiently?
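
To make the idea concrete, here's a minimal sketch of handing a data-parallel job to an on-die accelerator, using C++ AMP as one example of the model (the brighten() function and the 1.1f factor are just things I made up for illustration; HSA itself is a lower-level spec than this):

Code:
// Minimal sketch of the co-processor model using C++ AMP (MSVC).
// The runtime hands the lambda to whatever accelerator is available
// (e.g. the integrated GPU) while the CPU stays free for serial work.
#include <amp.h>
#include <vector>

void brighten(std::vector<float>& pixels) {
    concurrency::array_view<float, 1> view(static_cast<int>(pixels.size()), pixels);
    concurrency::parallel_for_each(view.extent,
        [=](concurrency::index<1> i) restrict(amp) {
            view[i] *= 1.1f;    // runs on the accelerator, not the CPU
        });
    view.synchronize();         // make the results visible to the host again
}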
 

Steel

Banned
Ignorant reply?

The CPUs you see in gaming rigs are CPUs you'd find in any given desktop depending on the year you buy it. Or console. The difference between gaming PCs and everything else is the GPUs, mostly, which already have a wide range of power efficiencies to choose from and don't always run at max tilt.
 

AmyS

Member
Gaming PCs are crazy environmentally unfriendly as it is. It shouldn't come as a surprise if governments around the world eventually limit the maximum power consumption of CPUs and GPUs. As our resources dwindle, they have already started to impose similar restrictions on all sorts of devices.

Next Gen canceled.
 

Sakura

Member
Gaming PCs are crazy environmentally unfriendly as it is. It shouldn't come as a surprise if governments around the world eventually limit the maximum power consumption of CPUs and GPUs. As our resources dwindle, they have already started to impose similar restrictions on all sorts of devices.

Doesn't that kind of depend where you get your electricity from?
We get electricity from a renewable source, so I don't really see how a gaming PC for me would be environmentally unfriendly.
 
Makes sense to me. 10nm is proving particularly hard to get decent yields from, there's no real reason for them to raise performance at the cost of area/power, and everything is going into the mobile space anyway. Heck, with Apple's A9 chips being as wide as they are, ARM procs will hit the same kinds of problems (while also trying to stay within strict power envelopes).
 
Still using my 3-year-plus 3770K, which hasn't been overclocked...

Hey, me too! I'm probably gonna upgrade some time this year just so I can finally have USB3 and SATA3 without needing expansion cards. Skylake's prices and availability are bullshit right now, though, so I'm not sure what I'll get.

OT: Given the industry-wide problems we've seen with die shrinks in recent years, I can't say this is surprising. Moore's law had a great run but it couldn't last forever.
 

Steel

Banned
Doesn't that kind of depend where you get your electricity from?
We get electricity from a renewable source, so I don't really see how a gaming PC for me would be environmentally unfriendly.

Even if you played on a gaming PC at full tilt 12 hours a day, 365 days a year, it'd still use less energy than a refrigerator over the same period.
 
Next Gen canceled.

All hail the mobile overlords. Konami won.

The CPUs you see in gaming rigs are CPUs you'd find in any given desktop depending on the year you buy it. Or console. The difference between gaming PCs and everything else is the GPUs, mostly, which already have a wide range of power efficiencies to choose from and don't always run at max tilt.

I wonder what he would think if he saw the energy consumption of data centers.

Better shut down Google, Facebook, Microsoft.
 
I hope this means AMD can catch up now and provide more affordable but still competitive alternatives.

AMD doesn't have the R&D for that. Intel spends more on R&D alone than AMD spends in total each year. AMD is just getting to 14nm FinFET next year with Zen, and that's not even their tech since they don't own fabs anymore; it's GlobalFoundries' work. AMD will still be stuck in the same position they're in now: irrelevant on the high end, with some mid- and low-range CPUs that might be worth buying if they're cheap enough.

If Intel can't get the job done without another type of material to make processors smaller/faster, no one will. Intel has the most money and the most talent, and if they say shit's hard to do, you'd better believe it.
 

Occam

Member
The CPUs you see in gaming rigs are CPUs you'd find in any given desktop depending on the year you buy it. Or console. The difference between gaming PCs and everything else is the GPUs, mostly, which already have a wide range of power efficiencies to choose from and don't always run at max tilt.

I know, but that won't stop legislatures from imposing limits in the future.
Like I said, it's already happening for all sorts of devices, for instance light bulbs and vacuum cleaners in the EU.

Game console makers are trying to preempt this in Europe: http://ec.europa.eu/growth/tools-databases/newsroom/cf/itemdetail.cfm?item_id=8239&lang=en
 
I still have no idea if I should upgrade my CPU. It feels like no tech progress has been made at all; CPUs are still as expensive as they were in summer 2012 when I bought mine.

I now have a 980 Ti as my GPU, but my CPU is a little i5 2320 (3.0GHz). I don't know if this is going to be a bottleneck, but I'm certainly getting huge frame rate drops (down to about 20 fps) in AC Syndicate at max settings.
 
Thanks to HSA, PCs & consoles will go back to having specialized co-processors, because now they can work together better.

Why have a beastly CPU running hot & using a lot of energy when you can have a specialized processor for the task that can do the job more efficiently?

Doubt it. Modern CPUs/GPUs are not inefficient; you just have to think about where you want to run your code and how you write it. Development of big, specialized, exotic processors isn't something we will see again in the console world anytime soon; the costs are just too high. Nobody wants to spend countless billions on chip development, and complexity and requirements are just too high today. The current model of using off-the-shelf components is extremely successful, and I don't see them changing it in the near future.
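
To illustrate the "how you write it" half with a toy example (the flat 4096x4096 row-major array is just an assumption I picked): both functions below do identical arithmetic on the same CPU, but the first walks memory sequentially and will typically run several times faster than the second.

Code:
// Same work, different memory access pattern.
#include <cstddef>
#include <vector>

constexpr std::size_t N = 4096;   // matrix stored flat, row-major

double sum_fast(const std::vector<double>& m) {
    double s = 0.0;
    for (std::size_t r = 0; r < N; ++r)
        for (std::size_t c = 0; c < N; ++c)
            s += m[r * N + c];    // stride-1: cache- and prefetcher-friendly
    return s;
}

double sum_slow(const std::vector<double>& m) {
    double s = 0.0;
    for (std::size_t c = 0; c < N; ++c)
        for (std::size_t r = 0; r < N; ++r)
            s += m[r * N + c];    // stride-N: a cache miss on almost every read
    return s;
}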
 
Shit, I'm fine with that. Gives me opportunity to save my money for other things such as new phone every 2 years, car, new Nvidia Ti once a year, VR, etc.
 

GhaleonEB

Member
article said:
Intel has said that new technologies in chip manufacturing will favour better energy consumption over faster execution times – effectively calling an end to ‘Moore’s Law’, which successfully predicted the doubling of density in integrated circuits, and therefore speed, every two years.
Moore's Law is about transistor density and cost; it does not govern how those transistors are used in processor design (performance vs. power).
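
Put as a formula, all the law claims is density(t) ≈ d0 × 2^(t/2) with t in years. A quick sketch of just that, nothing more:

Code:
// Moore's Law: density doubles roughly every two years,
// density(t) = d0 * 2^(t/2). It says nothing about whether the extra
// transistors go into clock speed, more cores, or power savings.
#include <cmath>
#include <cstdio>

int main() {
    const double d0 = 1.0;                       // normalized density today
    for (int year = 0; year <= 10; year += 2)
        std::printf("+%2d years: %2.0fx density\n",
                    year, d0 * std::pow(2.0, year / 2.0));
}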
 

Flai

Member
I find it hard to believe that. If AMD had comparable performance, Intel would find a way to improve perf.

They could probably easily increase performance, but at the cost of efficiency/power consumption. And that's not what they're aiming for.
 

Helznicht

Member
I still have no idea if I should upgrade my CPU.
I now have a 980 Ti as my GPU, but my CPU is a little i5 2320 (3.0GHz). I don't know if this is going to be a bottleneck,

Definitely going to be holding you back in a majority of games.

I would pair an i5 at 3.5GHz with a 970, an i5 at 4GHz with a 980, and an overclocked i5 with a Ti.

As a price-conscious upgrade, the i5-4690K can be found for just under $200 on sale. It supports Turbo Boost out of the box up to 3.9GHz and OCs fairly well with a good cooler (~4.5GHz).
 
They could probably easily increase performance, but at the cost of efficiency/power consumption. And that's not what they're aiming for.

Intel could pull an FX-9590 if they wanted to and explode power consumption for some small gains in performance, but outside of specific use cases, notably gaming here, the consumers of high-end CPU performance don't want that.

Gamers are in a tough spot in that they want really, really high single-threaded IPC, since games generally work best with that. Most other consumers of high-end CPUs, like server farms and supercomputing, would rather have more cores with more efficiency, even if each core isn't particularly fast on its own. Intel focusing on more efficient cores lets them build servers with more cores in the same power envelope, increasing performance for those applications. Considering the amount of money Intel makes on those customers, compared to people buying i7-6700Ks to overclock, you can see why Intel makes the decisions it does. For the overwhelming majority of desktop computing, by the time we hit Sandy Bridge we had probably solved the problem of having 'fast enough' CPUs.

Finding ways to get game engines to multithread better, which is something DX12 and Vulkan promise to achieve, is how we're going to see big jumps in CPU-related game performance in the near future.
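
To sketch the shape of that (plain std::thread code, not actual D3D12/Vulkan calls; the CommandList struct is a stand-in I made up): each worker records its own slice of the frame's draw calls in parallel, and the main thread submits everything at once, instead of one thread feeding the GPU serially.

Code:
// Parallel command recording, the pattern DX12/Vulkan make practical.
#include <algorithm>
#include <thread>
#include <vector>

struct CommandList { std::vector<int> cmds; };    // stand-in for an API object

void record_slice(CommandList& cl, int first, int last) {
    for (int obj = first; obj < last; ++obj)
        cl.cmds.push_back(obj);                   // pretend draw call per object
}

int main() {
    const int objects = 10000;
    const unsigned workers = std::max(1u, std::thread::hardware_concurrency());
    std::vector<CommandList> lists(workers);
    std::vector<std::thread> pool;

    const int per = objects / static_cast<int>(workers);
    for (unsigned w = 0; w < workers; ++w) {
        const int first = static_cast<int>(w) * per;
        const int last  = (w + 1 == workers) ? objects : first + per;
        pool.emplace_back(record_slice, std::ref(lists[w]), first, last);
    }
    for (auto& t : pool) t.join();                // every slice recorded in parallel
    // A real engine would now submit all the lists to the GPU queue in one go.
}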
 

Lebon14

Member
Reminds me of the jump from the Pentium 4 days to the Core 2 Duo/Quad days, where we went from 3.0+GHz x86-only single-core P4s to 1.5GHz-2.8GHz x86-64 dual- and quad-core CPUs. I think performance will still improve; it's just that the cores will be much more efficient at processing information: less latency, more threads per core, etc.

I'm still on a Core i7 950 personally (at 3.07GHz), not overclocked. I'll only be upgrading when this one breaks completely.
 
I'm just waiting for the successor to the GTX 970 before I upgrade. From now on it's going to be RAM and GPU upgrades, and possibly a larger SSD; my 3770K will stay.
 
It's funny seeing Intel scramble to compete with ARM, but at the same time it's coming at the cost of desktop power. :(
Well, with computer sales declining, except for Apple, this is probably a smart decision for Intel to make to compete with ARM. I think at some point they will swing back; they still have enterprise processors to make besides consumer processors.
 
Given how the average PC doesn't really utilize the full capacity of current processors, I'm not surprised at this outcome.

Only servers, commercial machines, and gaming PCs require high speeds, and even they can benefit from being more efficient.
 
My 3770K at 4.2GHz will last me until I quit PC gaming, I guess. Awesome. Upgraded from an i7-920, which might still be useful for today's games too, lol.

I'm still gaming with my i7 920. Starting to feel the itch to upgrade though.
 

onQ123

Member
Gemüsepizza said:
Doubt it. Modern CPUs/GPUs are not inefficient; you just have to think about where you want to run your code and how you write it. Development of big, specialized, exotic processors isn't something we will see again in the console world anytime soon; the costs are just too high. Nobody wants to spend countless billions on chip development, and complexity and requirements are just too high today. The current model of using off-the-shelf components is extremely successful, and I don't see them changing it in the near future.

Who said anything about big exotic processors? I'm talking about using specialized processors from companies that specialize in things like audio, video, computer vision, neural networks & so on, making the CPU's job easier.
 