
Intel: future CPUs to be slower but more efficient

Deku Tree

Member
Intel's biggest competition now is ARM. The majority of computer chips in the future are going to be low-power parts going into appliance-like devices. Of course they have to focus on power consumption.
 

kennyamr

Member
This is the end of an era but it also means the rise of quantum computing.
Imo that is super exciting. It's finally happening, and it will def change the world in the upcoming years. The current technology finally hit that wall after all these years, but we all knew it was coming, and somehow I'm glad they're ready to move on.
We will have to wait, yes, but the rewards will be so good.
#believe
Also Half Life 3 confirmed, why not.

For more info: https://www.youtube.com/watch?v=JhHMJCUmq28
 

Trojan X

Banned
For my gaming, no problem.
For my 3d rendering, much much more power, please. I want Star Trek CPU power in a single chip.
 

Heigic

Member
That and I think the majority of their customers would prefer 10% more battery life in their laptop over a 10% faster processor.
 

Theonik

Member
Even if there was competition, it isn't like gamers are some super large cash source (even if, per product, they pay a nice premium).

Mobile with low TDP and high efficiency is where the moolah is, my friend.
Not really. Mainstream has moved to portable computers, sure, but mainstream devices are usually a race to the bottom. Intel has been fortunate to have no competition in that space, which allows them some viable pricing, but they can't break into the ultra-mobile space ARM cores occupy at the moment, so they are dealing with a fair bit of attrition there. Where Intel makes big bucks is their Xeon server cores, which have very high margins and are growing, whilst their client revenues and profits are dropping.

It is also important to note that Intel's client processors are seeing decline everywhere except enthusiast desktops at the moment, which, while still niche, are sufficient to subsidize their continued existence.
 

Skinpop

Member
I wish Intel would also just produce CPUs with higher core counts and no integrated graphics. I know some people who got Haswell Xeons at good prices for that reason.

for what purpose? I get the benefit if you are doing offline rendering or some kind of parallel computing, but for most purposes a few strong cores are much better than many weak ones.

I wish Intel had continued with Larrabee and fully integrated GPUs with CPUs, so that we could go back to having one beefy processing unit instead of this CPU + GPU nonsense we have today.
 

pestul

Member
Do you have any idea how much energy a PC consumes vs any of those other home appliances? We are talking orders of magnitude here, and significantly lower efficiencies. If PCs were anywhere near as bad, not only would they already be legislated upon, they just would flat out not work.
True for general application. When I'm gaming, my PC consumes more energy than just about every device or appliance in my house. Ranking above it of course are space heaters and the hot water boiler. I don't count the short burst appliances like kettles or oven/stove because of the time frame of use.

Two years ago I used my gaming PC (including GPU) to run Folding@Home. The household power usage was consistently up 20-30% for the four months I did that. CPUs have become more efficient since my i7 920, but GPUs are awful hogs.
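For what it's worth, that 20-30% figure is plausible back-of-the-envelope. A rough sketch (the wattage and household-usage numbers below are my own assumptions for illustration, not measurements):

```python
# Rough estimate: how much a 24/7 folding rig adds to a household's usage.
# All figures are illustrative assumptions, not measurements.
PC_DRAW_WATTS = 400            # assumed i7 920 + GPU under full Folding@Home load
HOURS_PER_MONTH = 24 * 30
HOUSEHOLD_KWH_PER_MONTH = 900  # assumed typical household baseline

pc_kwh = PC_DRAW_WATTS * HOURS_PER_MONTH / 1000  # watts -> kWh per month
increase = pc_kwh / HOUSEHOLD_KWH_PER_MONTH

print(f"PC adds {pc_kwh:.0f} kWh/month, a {increase:.0%} increase")
# -> PC adds 288 kWh/month, a 32% increase
```

So a rig folding around the clock lands right in that 20-30%+ range, depending on the baseline.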
 

etta

my hard graphic balls
Any good 2500K overclocking guides? Or is it mobo-specific? I haven't overclocked in like 10 years.

Mobo-specific, but it's mostly the same stuff; you've gotta change some settings to work better with a multiplier change.
If you have any decent-brand mobo it should be relatively pain-free.
 

KKRT00

Member
Are you perhaps assuming that I want to run the latest games at the highest settings?
I'm playing R6S on mostly High, some Very High just fine at 1080p/60 with a 2500K/680.

R6S runs at 60fps on consoles, so of course it will run great at 60fps on an i5 2500K.

But for other games, I'm not talking about highest settings. I played Overwatch on lowest settings on an i5 2500K @ 4GHz, and it dropped frequently below 100fps in combat; the game unfortunately does not feel smooth below 100.
Also, any high-end title that runs at 30fps on consoles has a good chance of struggling to run at 60fps on an i5 2500K.
 

dave_d

Member
In the future, people may not upgrade when their CPU/mobo is obsolete but when it simply dies. Looks like we're going to find out which hardware company's hardware lasts a long time and whose doesn't.
2600k @ 4.6ghz + asus p8z68v pro gen3 still going strong here since 2011

That sounds like my bro. He had my old Q6600/DP35DP rig with 8 gigs of RAM, and it was easier just to get a new motherboard, CPU, and RAM combo ($200) than to try to figure out which part actually broke. (We're guessing it was a puffy capacitor on the motherboard, but no idea which part really broke.)

Hey anybody know of a market for a semi functional cpu/motherboard/ram combo? :)
 

etta

my hard graphic balls
R6S runs at 60fps on consoles, so of course it will run great at 60fps on an i5 2500K.

But for other games, I'm not talking about highest settings. I played Overwatch on lowest settings on an i5 2500K @ 4GHz, and it dropped frequently below 100fps in combat; the game unfortunately does not feel smooth below 100.
Also, any high-end title that runs at 30fps on consoles has a good chance of struggling to run at 60fps on an i5 2500K.

Then I can just lock it to 30fps, unless it's a shooter.
The point is, there's diminishing returns on upgrading for those people that can sacrifice the highest setting and/or FPS.
 

LilJoka

Member
Intel's biggest competition now is ARM. The majority of computer chips in the future are going to be low-power parts going into appliance-like devices. Of course they have to focus on power consumption.

Yes, and the enterprise market is just going to disappear and buy dishwashers and microwaves instead. Do you see ARM or AMD doing anything for high-performance computing? Cloud computing? Big data?

Better to sell many cheap CPUs in appliances that nobody ever upgrades, or far fewer but extremely expensive Xeons which companies rotate quite often?

I don't think Intel has anything to worry about.
 

Theonik

Member
True for general application. When I'm gaming, my PC consumes more energy than just about every device or appliance in my house. Ranking above it of course are space heaters and the hot water boiler. I don't count the short burst appliances like kettles or oven/stove because of the time frame of use.

Two years ago I used my gaming PC (including GPU) to run Folding@Home. The household power usage was consistently up 20-30% for the four months I did that. CPUs have become more efficient since my i7 920, but GPUs are awful hogs.
I am not really convinced. For the most part, PC parts remain idle during general computing and only hit their peak consumption for very small amounts of time. Even high-end GPUs, I would argue, are quite efficient now for the performance you are getting. If you want to lower your consumption, you can sacrifice fidelity to lower your card utilization, which of course you won't do.

I also would not discount devices which are only turned on for short periods from this equation. The average user of even a high-end gaming PC is not likely to be running it at full tilt each week for longer than they use their cookers, especially considering how much more those other devices draw for the time they are used.
 

KKRT00

Member
Then I can just lock it to 30fps, unless it's a shooter.
The point is, there's diminishing returns on upgrading for those people that can sacrifice the highest setting and/or FPS.

If you are fine with 30fps, sure, but only for this generation (excluding VR). This was known since the first current-gen console specs leaked.
 

Vanillalite

Ask me about the GAF Notebook
They are innovating... just not towards stuff that enthusiast gamers care about.

I.e. their profit margin on server chips (which value efficiency) is boss compared to desktop. However Google, their biggest customer, is trying to get into ARM for the power efficiency.

This is the real takeaway.

Money isn't in improving consumer level stuff anyways.
 

mrklaw

MrArseFace
for what purpose? I get the benefit if you are doing offline rendering or some kind of parallel computing, but for most purposes a few strong cores are much better than many weak ones.

I wish Intel had continued with Larrabee and fully integrated GPUs with CPUs, so that we could go back to having one beefy processing unit instead of this CPU + GPU nonsense we have today.

Integrated GPU takes up half the die these days. You could ditch it and put an 8-core CPU with hyper-threading in the same space. But then where is the business reasoning to do that? Intel will just continue to price those 6-8 core CPUs out of the reach of most people.
 

zoku88

Member
for what purpose? I get the benefit if you are doing offline rendering or some kind of parallel computing, but for most purposes a few strong cores are much better than many weak ones.

I wish Intel had continued with Larrabee and fully integrated GPUs with CPUs, so that we could go back to having one beefy processing unit instead of this CPU + GPU nonsense we have today.
Intel did pretty much continue with Larrabee. That's kind of what the Knights chips are. Not for general purpose computing though.
 

Livedili

Banned
Looks like my i7 6700k will last until the day I die or quantum cpus are introduced at a customer-affordable price. Can't say I'm mad. Actually, that saves me a lot of money :)
 

McHuj

Member
Looks like my i7 6700k will last until the day I die or quantum cpus are introduced at a customer-affordable price. Can't say I'm mad. Actually, that saves me a lot of money :)

Lol. Why are people saying things like this? The Intel guy is talking about tech that will come into play once process shrinks stop, probably a decade from now, not in the next year or two.
 

Easy_D

never left the stone age
So, while it may take some time, the next big step is basically gonna be huge, isn't it?
 

Skinpop

Member
Integrated GPU takes up half the die these days. You could ditch it and put an 8-core CPU with hyper-threading in the same space. But then where is the business reasoning to do that? Intel will just continue to price those 6-8 core CPUs out of the reach of most people.
I'm not asking for an eight core cpu with HT, I'm asking for an integrated GPU/CPU with beefier GPU - beefy to the point where discrete cards are obsolete. I'm asking for something where we don't even make the distinction between gpu/cpu but rather just have a PU that does both. The current system is just an inefficient black box that causes grief to devs as well as consumers. You'd lose the ability to upgrade the gpu but on the other hand the system would be balanced so there'd be less incentive to do so anyway.

Intel did pretty much continue with Larrabee. That's kind of what the Knights chips are. Not for general purpose computing though.
Well, I want the Larrabee we would have had today if they'd gone forward with their plans pre-2010.

So, while it may take some time, the next big step is basically gonna be huge, isn't it?
Probably not. Intel doesn't have an incentive for taking huge steps.
 
I'm not asking for an eight core cpu with HT, I'm asking for an integrated GPU/CPU with beefier GPU - beefy to the point where discrete cards are obsolete. I'm asking for something where we don't even make the distinction between gpu/cpu but rather just have a PU that does both. The current system is just an inefficient black box that causes grief to devs as well as consumers. You'd lose the ability to upgrade the gpu but on the other hand the system would be balanced so there'd be less incentive to do so anyway.
Basically you're asking for a console APU.

It's a trade-off between having less latency among different processors (APU) and having more flops/thermal headroom/combined die size/transistor count (discrete CPU + GPU).
 

Theonik

Member
Lol. Why are people saying things like this? The Intel guy is talking about tech that will come into play once process shrinks stop, probably a decade from now, not in the next year or two.
Process shrinks have already slowed significantly and will only get slower still. Broadwell was massively delayed because of this, and the next step is only likely to be harder. From a consumer's perspective, this means we are likely to enter this trend sooner rather than later.
 

Skinpop

Member
Basically you're asking for a console APU.

Maybe, I don't know to what degree they are integrated, so it's hard to say if it's what I want.
I want it made by Intel though, and for use in a regular PC.

It's a trade-off between having less latency among different processors (APU) and having more flops/thermal headroom/combined die size/transistor count (discrete CPU + GPU).
and that's why I'm bitter about Larrabee. Six years into it, I'm sure the tech would have matured into something great.
 

Crisium

Member
But until quantum comes, all these 5-10% IPC improvements per generation will eventually add up and make you upgrade your Sandy Bridges. Maybe.
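Those gains do compound, for what it's worth. A quick sketch (the per-generation percentages and generation count are just illustrative assumptions):

```python
# Cumulative speedup after n generations of a fixed per-generation IPC gain.
def cumulative_gain(per_gen: float, generations: int) -> float:
    """Total speedup factor from compounding per-generation gains."""
    return (1 + per_gen) ** generations

# Five generations at 5% each is already ~28% over Sandy Bridge;
# at 10% each it's ~61%.
for g in (0.05, 0.10):
    print(f"{g:.0%}/gen over 5 gens -> {cumulative_gain(g, 5):.2f}x")
# -> 5%/gen over 5 gens -> 1.28x
# -> 10%/gen over 5 gens -> 1.61x
```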
 

tbhysgb

Member
A lot of you don't remember the Pentium days, do ya? They hit a wall speed-wise and had to get that speed through architecture. Same thing here: chips might hit a wall cycle-wise, but we'll see more instructions per cycle if need be.
 

Locuza

Member
Basically you're asking for a console APU.
Actually I find the HPC APU concept pretty cool.
A full-blown CPU + GPU on one package, linked through a strong coherent fabric.
It simply fixes the bad latency and bandwidth of PCIe, plus guarantees coherent data.

Maybe, I don't know to what degree they are integrated, so it's hard to say if it's what I want.
I want it made by Intel though, and for use in a regular PC.


and that's why I'm bitter about Larrabee. Six years into it, I'm sure the tech would have matured into something great.
They are tightly integrated.
CPU + GPU share a coherent ring bus, with very low latency and high throughput.
The newest Skylake Iris Pro will have over 1 TFLOP, roughly Xbox One level (in theory).

Larrabee of course goes one step further, but chooses other compromises.
 

jose1

Member
If cpus hit a brick wall and games continue taking advantage of multithreading, I might just build a system with dual Xeons and be good for life.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
One thing is clear: the single-thread performance free lunch is over. And Amdahl's law is not going anywhere.
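For anyone who hasn't run the numbers: Amdahl's law says that if only a fraction p of a workload parallelizes, n cores give at most 1 / ((1 - p) + p/n) speedup. A quick sketch (the 90% parallel fraction is just an illustrative assumption):

```python
# Amdahl's law: best-case speedup on n cores when fraction p of the
# work is parallelizable and the rest stays serial.
def amdahl_speedup(p: float, n: int) -> float:
    return 1.0 / ((1.0 - p) + p / n)

# Even a well-threaded workload (say 90% parallel) caps out at 10x,
# no matter how many cores you throw at it.
for n in (4, 8, 64, 10**6):
    print(f"{n} cores: {amdahl_speedup(0.9, n):.2f}x")
# -> 4 cores: 3.08x
# -> 8 cores: 4.71x
# -> 64 cores: 8.77x
# -> 1000000 cores: 10.00x
```

Which is why single-thread performance still matters even as core counts climb.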
 
Looks like my i7 4790 will last me a good while. I haven't had any issues with it at all. It runs smoothly. In 20 years I guess we'll be using quantum computers, which will make today's computers look slow.
 
I just upgraded from a 3570k to a 5820k and saw massive gains. I think the 2500k is adequate but outdated. My workflow has improved greatly with the new CPU and a lot of the slowdown I was getting in modern games has cleared up without changing my GPU.


Not necessarily but there are many other tasks that benefit greatly from increased CPU power. I spend a lot of time producing video content and any amount of CPU you can throw at it has a beneficial effect.

outdated? lol. I still crank games to ultra
 
I don't understand those posts. Are you not playing demanding games on your PC? Because the 2500K is already showing its age in games.
8 threads are the minimum for high-performance gaming right now. I could understand a 2600K @ 4.6, but the 2500K is getting really old.

---

Yep.

You have a vendetta against those with a 2500K? Sorry to say, the 2500K is still a beast. I run it OC'd to 4.5 with a GTX 980 and everything on best settings. Flawless.
 