
Intel: future CPUs to be slower but more efficient

d00d3n

Member
I just upgraded from a 3570k to a 5820k and saw massive gains. I think the 2500k is adequate but outdated. My workflow has improved greatly with the new CPU and a lot of the slowdown I was getting in modern games has cleared up without changing my GPU.

Did you overclock your 3570k?
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
Anandtech said that Skylake's IPC is on average 15-20% higher than Sandy Bridge's and about 7% higher than Haswell's. I think it would be best to wait at least for Kaby Lake. It'll include a lot of new AVX-512 instructions as well as SHA-1 acceleration, which may prove useful one day. Or when games actually start making heavy use of AVX2; that's another good time to upgrade from Sandy or Ivy Bridge.
Xeon-only.
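
For anyone curious whether their current chip already exposes any of these, here's a minimal sketch (GCC/Clang on x86; bit positions per Intel's documentation, so treat it as illustrative) that queries CPUID leaf 7 for AVX2, AVX-512F, and the SHA extensions:

[code]
// Minimal sketch: query CPUID leaf 7 (subleaf 0) for the ISA extensions
// discussed above. GCC/Clang on x86 only; this detects support, it doesn't use it.
#include <cpuid.h>
#include <cstdio>

int main() {
    unsigned eax = 0, ebx = 0, ecx = 0, edx = 0;
    if (!__get_cpuid_count(7, 0, &eax, &ebx, &ecx, &edx)) {
        std::puts("CPUID leaf 7 not supported");
        return 1;
    }
    std::printf("AVX2:     %s\n", (ebx & (1u << 5))  ? "yes" : "no");
    std::printf("AVX-512F: %s\n", (ebx & (1u << 16)) ? "yes" : "no");
    std::printf("SHA ext:  %s\n", (ebx & (1u << 29)) ? "yes" : "no");
    return 0;
}
[/code]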
 

Averon

Member
Not surprising. New node shrinks are getting harder and harder to pull off at acceptable cost and heat. There is a reason why it took so long to move from the 28nm node.

Very soon (4-5 years, just about) Intel and others are going to have to move from silicon, or try to supplement it with another material. Whether it's carbon nanotubes, graphene (or some other 2D material), spintronics, III-V materials, or something else, no one in the industry knows.

In any event, the next 5-10 years in the semiconductor industry will be interesting to watch.
 
I don't think most people use 980 Ti SLI at 1080p. No shit you're going to be CPU limited in such a situation.

Excellent point. I was really surprised at the high CPU impact in those benches; I should have looked more closely.

Still, it's impressive to see non-theoretical benchmarks where hyperthreading has such a strong positive effect.
 

Kezen

Banned
Still, it's impressive to see non-theoretical benchmarks where hyperthreading has such a strong positive effect.

It's not new.
[image: Crysis 3 CPU benchmark]
 

Tarin02543

Member
3770k forever.

In the 90s CPU power was really important as lots of games were software rendered.

My AMD K6 200MHz was outdated within a year.
 

Zafir

Member
Guess I can push my new PC build back to next year or maybe later....

My Intel 3570k @ 4.2GHz still going strong, lol.
 

RE4PRR

Member
i7-5820k ftw. Didn't plan to upgrade from this anyway, but if there are even fewer gains to come from the next few gens, that's good enough for me.
 

RiverBed

Banned
I am disappointed by this news. As a tech geek, I love to always see the tech envelope being pushed. I noticed a few years ago how clock speeds were stalling and even reversing. I told myself that multi-core (which also fucking stalled at 4 cores for years), hyperthreading, and all that jazz make up for it, but there was always a part of me that wasn't convinced. CPUs were also not a bottleneck, overall, but I still want CPUs to keep advancing. A few years back, I really thought 6- and 8-core CPUs would be the norm for new PCs and PC upgrades...

Well, at least SSDs will reach price parity with HDDs by the end of this year or so, and GPUs should see a noticeable step forward this year as well...
 

Genio88

Member
I would have stuck with my i7 4770k for a long time anyway. Besides, slower could just mean clock speed; you can clock an AMD Phenom to 5GHz, but it wouldn't reach the performance of a 3.5GHz i7, for example.

I guess they're talking about the mobile scene. They're pretty involved with Microsoft on the Surface series, and with all the other companies who are now copying the Surface. Microsoft could also have asked them for an x64 CPU for the rumored Surface Phone, so it's easy to figure out why they're trying to make their CPUs as efficient and tiny as they can. Seems like the future is mobile, even for real desktop PCs.
 

kraspkibble

Permabanned.
I'm not building a new PC for about another year, but I plan on getting an i7-6700K or i7-7700K (if Kaby Lake comes out this year), so it sounds like that will last me a long time if people are getting away with Sandy Bridge CPUs.

I hope this allows AMD to catch up and light a fire under Intel's ass.

Well, at least SSDs will reach price parity with HDDs by the end of this year or so, and GPUs should see a noticeable step forward this year as well...

Really? A 1TB 7200RPM drive costs about £40 and a 1TB SSD about £260 (roughly 4p/GB versus 26p/GB). I don't see them dropping ~£200 this year.
 

Durante

Member
Upgrading to a 5820k in 2014 (when it was actually much cheaper in € than it is now!) may turn out to be as good a CPU decision as getting an i7 920 back in 2008.
 

Zexen

Member
^
Yeah, prices are fucking crazy now, bought my 4790k for less than 300€ at launch, now it costs +/- 360€.

Regarding the info: what does it mean for old software that doesn't take advantage of new tech, multi-core, etc.? Will the lower frequencies result in lower performance, or will things stay on par or get slightly better?
 

Qassim

Member
When this starts to happen, I'll buy the latest -E architecture's 5960X equivalent or higher and settle in for the long winter without performance upgrades.
 

Bashtee

Member
Unless they increase the number of cores at a reasonable price, the regular desktop will be doomed. Welcome our new low-energy x86 overlords, the competition to ARM-based models.

I wonder if AMD can come up with a crazy creative solution or just follow Intel.

Well, small steps, first release AMD Zen.
 

TheOMan

Tagged as I see fit
I have some exposure to the industry, and suffice it to say, CMOS (on silicon) is the elephant no one can dislodge.

To give an analogy: there was speculation that the industry was going to move onto X-Ray lithography over a decade ago because foundries were hitting the limits of UV. And they're still on UV, because the cost of retooling (and the special materials involved) is just too high.

Moving away from CMOS would take so much upfront capital that no one would be willing to take the risk.

Could you take an educated guess at what that cost might be?
 

Locuza

Member
I wonder if AMD can come up with a crazy creative solution or just follow Intel.

Well, small steps, first release AMD Zen.
Zen at least is not crazy in regard to single-threaded performance:
2 AGUs, 128-bit FP pipes, one FMA result per cycle.
Zen is a small core and should be quite efficient.
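
For scale, a back-of-envelope sketch of what one 128-bit FMA result per cycle means for peak FP32 throughput (the core count and clock here are made-up illustrative numbers, not announced specs):

[code]
// Back-of-envelope peak FP32 throughput given one 128-bit FMA result
// per cycle per core. Core count and clock are illustrative assumptions.
#include <cstdio>

int main() {
    const double cores = 4.0;              // assumed core count
    const double clock_ghz = 3.0;          // assumed clock, billions of cycles/s
    const double flops_per_cycle = 4 * 2;  // 4 FP32 lanes x 2 ops (mul + add)
    std::printf("Peak: %.0f GFLOPS FP32\n", cores * clock_ghz * flops_per_cycle);
    return 0;  // prints 96 GFLOPS for these numbers
}
[/code]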
 

Three

Member
I have some exposure to the industry, and suffice it to say, CMOS (on silicon) is the elephant no one can dislodge.

To give an analogy: there was speculation that the industry was going to move onto X-Ray lithography over a decade ago because foundries were hitting the limits of UV. And they're still on UV, because the cost of retooling (and the special materials involved) is just too high.

Moving away from CMOS would take so much upfront capital that no one would be willing to take the risk.

I don't think it was mere speculation; there was a real drive. I attended a lot of X-ray conferences where Intel-backed researchers presented their work. That was about six years ago though, so no idea how far it got.
 

roytheone

Member
I still use my i7-870. Sure, I am getting CPU bottlenecked in a lot of games, but if I want to upgrade my CPU I also have to upgrade my motherboard, and that sounds like a lot of work :(
 

Sulik2

Member
The onus now is on developers to take greater advantage of multiple cores and threads. Sixteen-core CPUs with 32 threads, even at slower speeds, should easily be able to do more computing than a four-core if you program them right. Sony was a decade too early with the Cell.
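
A quick Amdahl's-law sanity check on that (the clock penalty and parallel fractions below are made-up numbers, just to show the shape of it):

[code]
// Amdahl's law sketch: when does a 16-core chip at a lower clock beat a
// 4-core chip? The 0.8 clock penalty and the parallel fractions are
// illustrative assumptions, not measurements.
#include <cstdio>

// Speedup over one core for n cores when fraction p of the work is parallel.
double amdahl(double p, double n) {
    return 1.0 / ((1.0 - p) + p / n);
}

int main() {
    const double clock_penalty = 0.8;  // assume the 16-core runs at 80% clock
    for (double p : {0.5, 0.9, 0.99}) {
        std::printf("p=%.2f  4-core: %.2fx  slower 16-core: %.2fx\n",
                    p, amdahl(p, 4.0), amdahl(p, 16.0) * clock_penalty);
    }
    return 0;  // the 16-core only pulls ahead once p is high enough
}
[/code]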
 

LiquidMetal14

hide your water-based mammals
Good, CPUs will become more efficient. These things are really fast these days.

I'll take more cores though. 8 or 12 seems to be a good spot.
 

Three

Member
Could you take an educated guess at what that cost might be?

By cost, do you mean the greatest challenges?

The problems with X-ray lithography compared to UV:

1) Creating an X-ray source is more difficult, and the source is bulkier, than for UV. A lot more, but still doable.
2) The optics are not simple mirrors or lenses; X-rays like to go through things. The optics are multilayer mirrors, zone plates, and so on and so forth. The optics and their resolution become difficult too, and they are being heavily researched at the moment.
3) When you create smaller and smaller structures on silicon, defects become more and more of a problem.

If I get time I can post some presentations, though they are probably very dated by now.
 

Teletraan1

Banned
Really wish I would have gone with an i7 over that i5 all those years ago. Then I wouldn't even consider upgrading until there was some huge leap.
 
This is really in reference to single-core speed, die size, and ops per cycle. Of course they can improve the architecture and power consumption, and introduce more cores.

Sure, your existing CPU may be more than capable of carrying you through the next couple of years, but we're seeing more and more applications and games built with scalable multithreading that makes use of processors with four cores and beyond. This will be even more evident as games make use of DX12 and Vulkan.

The Division, which is a DX11 game, is an example of excellent CPU usage and scaling. Utilization is amongst the highest I've seen, near 60% in most cases. Additionally, the game scales very well beyond 2- and 4-core setups, so there is a real-world benefit to gaming with a 6-, 8-, and soon 10- and 12-core CPU.

Gone are the days when 6- and 8-core CPUs were deemed overkill for games.
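
The general pattern is nothing engine-specific; here's a minimal sketch of work that scales to however many hardware threads the chip reports:

[code]
// Minimal sketch of scalable multithreading: split a batch of work across
// however many hardware threads exist instead of hard-coding 2 or 4.
#include <cstdio>
#include <thread>
#include <vector>

int main() {
    unsigned hw = std::thread::hardware_concurrency();  // e.g. 12 on a 6c/12t part
    const int nthreads = hw ? static_cast<int>(hw) : 4; // fallback if unknown
    const int total_items = 1000000;

    std::vector<long long> partial(nthreads, 0);  // one slot per worker, no races
    std::vector<std::thread> pool;
    for (int t = 0; t < nthreads; ++t) {
        pool.emplace_back([&, t] {
            // Each worker takes an interleaved slice of the items.
            for (int i = t; i < total_items; i += nthreads)
                partial[t] += i;  // stand-in for real per-item work
        });
    }
    for (auto& th : pool) th.join();

    long long sum = 0;
    for (long long s : partial) sum += s;
    std::printf("threads=%d sum=%lld\n", nthreads, sum);
    return 0;
}
[/code]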
 

McHuj

Member
I wish there was a full transcript of his talk. While the speed of the transistors may decrease, there are many ways to increase absolute performance, so I'm not too worried about it. This will probably lead to interesting new CPU microarchitectures, not just more cores.

Until that time comes, we still have 10nm, 7nm, and maybe a 5nm node of traditional shrinks (although each seems to bring smaller gains).
 

Woo-Fu

Banned
It has been over for a while. That is what has been driving multi-core designs: them hitting their heads against a very real ceiling in terms of individual core speeds.

Not to say that multi-core doesn't have some clear benefits, of course.

A fully acknowledged plateau in consumer-level processors might actually be a good thing, at least in terms of encouraging the development of better software instead of just throwing more hardware at it when your first build doesn't perform so hot. :p
 

TechJunk

Member
Gaming PCs are crazy environmentally unfriendly as it is. It shouldn't come as a surprise if governments around the world eventually limit the maximum power consumption of CPUs and GPUs. As our resources dwindle, they have already started imposing similar restrictions on all sorts of devices.

Agreed, I'm all for the power, but I kinda like the idea that a gaming PC could also be energy efficient. Especially since mine doubles as a media server so it's pretty much running 24/7
 

NeoBob688

Member
CPU performance gains have already been disappointing over the last 7 years. I recently upgraded from an i7-930 (2010) to an i7-6700K (2015). Here are my single-core numbers for each processor, at stock and overclocked.

Geekbench 32-bit single-core score is shown if people are curious; higher is better.

i7-930 @ 2.8 GHz: 1886
i7-930 @ 3.8 GHz: 2403
i7-6700K @ 4.0 GHz: 4297
i7-6700K @ 4.7 GHz: 4981

In principle I get a bit more than twice the performance per core.

It is a good thing that the 6700k is a pretty good overclocker; the situation would be significantly less favorable otherwise because the 930 is such an overclocking beast.

While these gains are significant, they are hardly amazing considering the large time span between when these processors were released.
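
For anyone who wants the arithmetic spelled out, the per-core ratios and a rough annualized rate (the ~5.5-year launch gap is my estimate) work out as follows:

[code]
// Working the per-core gains out of the Geekbench scores above, plus a
// rough compound annual rate. The ~5.5-year launch gap is an estimate.
#include <cmath>
#include <cstdio>

int main() {
    const double stock_930 = 1886, oc_930 = 2403;
    const double stock_6700k = 4297, oc_6700k = 4981;
    const double years = 5.5;

    const double stock_gain = stock_6700k / stock_930;  // ~2.28x
    const double oc_gain = oc_6700k / oc_930;           // ~2.07x
    std::printf("stock: %.2fx  overclocked: %.2fx\n", stock_gain, oc_gain);
    std::printf("annualized (stock): %.1f%% per year\n",
                (std::pow(stock_gain, 1.0 / years) - 1.0) * 100.0);  // ~16%/yr
    return 0;
}
[/code]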
 

SmokedMeat

Gamer™
Fine by me!

In the back of my mind I was dreading having to eventually buy a new CPU that would probably use a new socket, thus requiring a new motherboard.

More money in my pocket.
 

LilJoka

Member
Intel knows they can do this because, in the same period where they slump in performance, AMD will catch up, and from these years of R&D Intel will probably unleash something spectacular. They can take this risk because AMD is so far behind, but I don't see them coming out of this period with nothing to show for it.
 

Atolm

Member
The real problem is that there's literally zero competition in the desktop CPU market. I just checked, and an i7-6700K is 375€ at my Amazon branch. The 5820K is 409€. Yikes! And it isn't much better with graphics cards; Nvidia can charge as much as they want, and they do: a 970 is 350€.

At this rate building a decent rig will be prohibitively expensive again as it was in 2000.
 