
Intel: future CPUs to be slower but more efficient

I'd recommend you upgrade that sooner rather than later.

I have a 6350 paired with a GTX 960 in one of my rigs. Other than PlanetSide 2, I haven't run into any games that give it much trouble. All of the modern games I've tried run at 30-60fps on high or ultra settings. It's obviously not as fast as my i5 + GTX 970 rig, but I don't really notice any difference at 1080p on the big screen, so I use the Intel rig on a 1440p monitor. To each their own, but I'd say a 6350 is just fine for now. There's really no reason to bother with the time or expense of upgrading until we see what Zen looks like.
 

KKRT00

Member
Do you have a vendetta against those with a 2500K? Sorry to say, the 2500K is still a beast. I run it OC'd to 4.5GHz with a GTX 980 and everything on best settings. Flawless.

Vendetta? No, I just have one and see its limitations to the degree that I can't wait to upgrade, but I'm waiting for Intel's new line and for information about next-gen GPUs from Nvidia and AMD. I still don't know if there will be any changes to motherboard support for them.
Saying that this CPU is still a future-proof solution for high performance, when it has already limited my GPU in many games, is just false.
 
Just reminded me: does the 2500K run on motherboards that have a max RAM capacity of 8GB? That's a tiny amount these days and will be another bottleneck soon enough.

I think my Asus P67 has something bricked inside.
Either that or G.Skill makes shitty memory sticks (I've tried three different sets).
 

Doctre81

Member
Sounds like most of you completely missed the point, or am I missing something here? What they mean is that the CPUs, while being clocked lower, will still perform better because they are more efficient. Kinda like how the Wii U's CPU cores are clocked WAY lower than the 360's, yet it can run those same games better than the 360 can.

All they are saying is that you shouldn't use CPU clock speed as the measure of performance, but honestly it has been like that for some time now. They are still going to come out with CPUs that outperform what you already have; they just may not be clocked as high. They are trying to cut down on heat.
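The arithmetic behind this point is simple: useful throughput is roughly IPC × clock, so a chip can drop its clock speed and still get faster if the architecture retires more instructions per cycle. A minimal sketch with made-up numbers (these are illustrative, not real chip specs):

```python
# Rough first-order model: performance ~ IPC * clock.
# Both "chips" below are hypothetical; the numbers are for illustration only.
def relative_perf(ipc, clock_ghz):
    """Billions of instructions retired per second."""
    return ipc * clock_ghz

old_chip = relative_perf(ipc=1.0, clock_ghz=3.2)  # narrow core, high clock
new_chip = relative_perf(ipc=1.6, clock_ghz=2.4)  # wider core, lower clock

# Despite the 25% clock drop, the wider core comes out ahead.
print(new_chip > old_chip)
```

This is of course a toy model; real performance also depends on memory latency, cache behavior, and how well software threads across cores.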
 

Skinpop

Member
That will cost you an arm and a leg... :p

Yeah, and that's why they aren't doing it.

They are tightly integrated.
Guess we are getting there then. Still, the problem is Intel :/

Ultimately, as a programmer I want to get to a place where I don't have to worry about the separation of CPU/GPU; I want to treat it like one powerful thing at a low level. Since more and more GPGPU stuff is introduced every generation, I think it's only a matter of time before that happens.
 

BeforeU

Oft hope is born when all is forlorn.
They don't have to innovate; they have a monopoly.

Making it faster is the only way to innovate?

Being efficient, running on low power, and producing the same computational power sounds innovative to me.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
Vendetta? No, I just have one and see its limitations to the degree that I can't wait to upgrade, but I'm waiting for Intel's new line and for information about next-gen GPUs from Nvidia and AMD. I still don't know if there will be any changes to motherboard support for them.
Saying that this CPU is still a future-proof solution for high performance, when it has already limited my GPU in many games, is just false.
Are you aiming for more cores in the new CPU? Otherwise I can't see how a 2500K could possibly be limiting you in per-thread performance.
 

KKRT00

Member
Are you aiming for more cores in the new CPU? Otherwise I can't see how a 2500K could possibly be limiting you in per-thread performance.

Of course I am. I'm building a new PC for Star Citizen. I was thinking about the 5820K, but with the new Broadwell-E announced, I'm still waiting. 6c/12t is the minimum I will buy.

And actually, having more threads helps tremendously right now. The 2600K is a much better CPU than the 2500K currently, whereas in the past they were equal 95% of the time.
 
In terms of gaming performance, does CPU overclocking provide a noticeable benefit? Most of the games I've played seem to lean on the GPU a lot more, with CPU usage rarely reaching even 70-80% in the more demanding titles. I've currently got an i7 4770K running at the 3.9GHz turbo boost (I previously had it overclocked to 4.2GHz, but had to remove that because it caused BSODs when creating 3D renders).
 

LilJoka

Member
In terms of gaming performance, does CPU overclocking provide a noticeable benefit? Most of the games I've played seem to lean on the GPU a lot more, with CPU usage rarely reaching even 70-80% in the more demanding titles. I've currently got an i7 4770K running at the 3.9GHz turbo boost (I previously had it overclocked to 4.2GHz, but had to remove that because it caused BSODs when creating 3D renders).

That OC is not much, so it was very likely not set up properly, even though it should be easily attainable. An overclock can affect minimum fps and frame-time consistency even in GPU-bottlenecked situations.
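If you want to check this on your own machine, log per-frame times and compare the average against the worst percentile; a CPU limit tends to show up in the "1% low" figure long before it moves the average. A minimal sketch, with the frame times invented for illustration:

```python
# Average fps vs "1% low" fps from a list of frame times in milliseconds.
# The sample frame times below are invented for illustration.
def fps_stats(frame_times_ms):
    avg_fps = 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))
    n_worst = max(1, len(frame_times_ms) // 100)  # the slowest 1% of frames
    worst = sorted(frame_times_ms)[-n_worst:]
    low_fps = 1000.0 / (sum(worst) / len(worst))
    return avg_fps, low_fps

# 97 smooth ~60fps frames with a few 40ms stutters mixed in:
frames = [16.7] * 97 + [40.0] * 3
avg, low = fps_stats(frames)
print(round(avg), round(low))  # the average hides stutter that the 1% low exposes
```

Tools like MSI Afterburner or PresentMon can export real frame-time logs to feed into something like this.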
 
In terms of gaming performance, does CPU overclocking provide a noticeable benefit? Most of the games I've played seem to lean on the GPU a lot more, with CPU usage rarely reaching even 70-80% in the more demanding titles. I've currently got an i7 4770K running at the 3.9GHz turbo boost (I previously had it overclocked to 4.2GHz, but had to remove that because it caused BSODs when creating 3D renders).

It's going to depend on the game. You can look up CPU benchmarks; nearly all of them include gaming benchmarks mixed in with the 'harder' ones (video encoding, etc.).
 
Actually, I find the HPC-APU concept pretty cool.
A full-blown CPU + GPU on one package, linked through a strong coherent fabric.
It simply fixes the bad latency and bandwidth of PCIe and guarantees coherent data.
Most PC gamers don't find APUs all that exciting, though...

What's funny is that consoles did first what Intel plans to do in the future: eight low-power, low-IPC cores. It basically makes multithreading a necessity. Who's going to mock the Jaguar now? ;)

Ultimately, as a programmer I want to get to a place where I don't have to worry about the separation of CPU/GPU; I want to treat it like one powerful thing at a low level. Since more and more GPGPU stuff is introduced every generation, I think it's only a matter of time before that happens.
Hmmm, you remind me of the integrated-FPU paradigm shift.

It was back in the 486 era that FPUs were integrated onto the same die as the CPU. Before that (the 80287/80387 coprocessor days) they were separate chips.

I guess it makes sense to do the same with the GPU, so that GPGPU can become as ubiquitous as x87/SSE2 are. :)

A lot of you don't remember the Pentium days, do ya? They hit a wall speed-wise and had to get that speed through architecture. Same thing here: chips might hit a wall cycle-wise, but we'll see more instructions per cycle if need be.
Not really. Moore's law was alive and kicking back in 1996.

I guess you're talking about the P5 (Pentium MMX) -> P6 (Pentium II) IPC increase.
 

slapnuts

Junior Member
My 3770K is still blazing along fine, matched with a Gigabyte R9 390 G1 8GB card. A perfect match for my CPU.
 

AmyS

Member
Actually, I find the HPC-APU concept pretty cool.
A full-blown CPU + GPU on one package, linked through a strong coherent fabric.
It simply fixes the bad latency and bandwidth of PCIe and guarantees coherent data.

Yep. Here's part of an article from SemiAccurate dated January 21st:

Finally we have AMD’s most exotic attack on the datacenter: a high-performance server APU. AMD’s HSA initiative has always had the HPC market written all over it. It’s good to see that AMD finally believes that HSA has matured to the point where it’s no longer just a consumer technology but a viable memory architecture for HPC applications. AMD lists multi-teraflops for HPC and Workstation as a characteristic of this APU implying that this chip will at a minimum offer 2 TFLOP/s of single precision compute. AMD’s Kaveri offers just under a TFLOP of compute performance so with AMD’s server chip we’re looking at a significant departure from the kinds of APU configurations we’re used to seeing.

Whether this means that AMD’s designing a monolithic large die-size APU or will be packaging a pile of its consumer APU dies together into one chip remains to be seen. Although there are rational arguments for both options. What AMD means when they mention a transformational memory architecture and scale-up graphics performance is an open question. At the very bottom of the slide AMD also notes that its open to working on semi-custom projects for datacenter chips. It will be interesting to see if any of AMD’s semi-custom efforts make it into products outside the gaming console market.

With AMD’s new roadmap we have public confirmation of a lot of little bits that we were already aware of. We also have a clear sign, coming on the heels of the launch of AMD’s A1100 series chips, that the company is committed to building new products for the datacenter market. The next two years are going to be nothing if not an interesting time as AMD tries to claw its way back to profitability with a variety of genuinely new products. S|A

http://semiaccurate.com/2016/01/21/38060/

I read the bolded part as: AMD's semi-custom Zen-based APUs are assumed to be a given for future consoles.
 
That OC is not much, so it was very likely not set up properly, even though it should be easily attainable. An overclock can affect minimum fps and frame-time consistency even in GPU-bottlenecked situations.

I had the overclock for about two years (it was set up by the company that sold me the PC in the first place). Since removing it, I haven't really noticed much degradation in game performance.
 
Did anyone else stop swapping CPUs yearly once the Q6600 arrived? Ever since then I've been on a four-year cycle.


I recently upgraded my 860 to a 6700K. I'd had the 860 for six or seven years, I think? Bought it when it was brand new. That thing was a beast.
I guess this one might last me just as long.
 
I guess future smug PC gamers will no longer brag about their games running better than consoles. Instead they will brag about their games consuming less power.
 

Guess Who

Banned
I'm hoping Apple will bite :)

As with iPhones, they may not need the best tech. They just need to customize it for their specific needs.

No chance. On desktop they've got no reason to switch from Intel, and on mobile they've got the best mobile SoC design team in the world in-house (and if they ever do switch from Intel on the Mac side, it'll probably also be to in-house chips).
 

AmyS

Member
Graphene processors need to get here ASAP.

Just think: there will probably be just one more generation of consoles that uses traditional silicon chips. Whether they're on 10nm or 7nm, there really wouldn't be anywhere else to go in silicon after that.
 
Apple's CPU design team has been embarrassing the rest of the industry since the A7.

Apple's CPU team has made a lot of interesting choices about what they put in their CPUs. They essentially designed a super-wide, high-IPC powerhouse that targets a lower frequency. They only put two cores on a chip, so they optimize for single-threaded performance (not a bad thing, but they can get smoked in multithreaded tests). I mean, seriously, what end user needs half a dozen FMACs in their mobile CPU?
 

prag16

Banned
For the few here touting massive gains with newer hardware: what are we talking about, exactly? Maybe double the score on some benchmarks that are heavily tailored to as many cores as possible (e.g. 2500K vs 5820K or something along those lines)?

The 2500K is what, nearing five years old now? So a 2-3x leap in five years? This is what we're considering massive gains now?

Not planning on moving off 1680x1050 for a while, I guess I can expect to ride my 2500K/970 combo for a damn long time yet.
 
I guess future smug PC gamers will no longer brag about their games running better than consoles. Instead they will brag about their games consuming less power.

Besides, even when the next gen of consoles can officially do 4K, most console publishers will go for native 1080p and 60 FPS to save on costs when developing titles.
 

Dice

Pokémon Parentage Conspiracy Theorist
What about those light-based processors I heard about? Is that not actually a viable path?
 

orochi91

Member
Besides, even when the next gen of consoles can officially do 4K, most console publishers will go for native 1080p and 60 FPS to save on costs when developing titles.

Sounds good to me.

I've read that 1080p upscales well to 4K; obviously not as good as native 4K, but it should still look great.
 
Intel is more concerned about battery life than horsepower. The PC is dying for application-based computing, with more being pushed to the cloud. They are following the money, which is mobile.

I'm not saying PCs are going anywhere, but the growth is in mobile platforms.
 
I guess future smug PC gamers will no longer brag about their games running better than consoles. Instead they will brag about their games consuming less power.

Where do you think console architecture comes from? Console makers aren't going to make their CPUs from the ground up anymore.
 

Trogdor1123

Gold Member
Mobo-specific, but mostly the same stuff; you've got to change some settings to work better with a multiplier change.
If you have any decent-brand mobo it should be relatively pain-free.

I have a 2500K, a Gigabyte Z68A-D3H-B3, and 8 gigs of G.Skill PC3-10700.

Any tips?
 

RE4PRR

Member
Hard to say whether your improved gaming performance is due to the CPU or something else. At the same time, you changed your motherboard and RAM, and possibly did a fresh system install. Without a clear A/B comparison and benchmarks, your perceived improvement could thus be due to other factors or just be pure placebo effect.

You can see in recent benchmarks that having an i7 over an i5 produces good gains.
 

DonMigs85

Member
Apple's CPU team has made a lot of interesting choices about what they put in their CPUs. They essentially designed a super-wide, high-IPC powerhouse that targets a lower frequency. They only put two cores on a chip, so they optimize for single-threaded performance (not a bad thing, but they can get smoked in multithreaded tests). I mean, seriously, what end user needs half a dozen FMACs in their mobile CPU?

They do have triple core variants as well, and iPad Pro is quad core if I remember right.
 

Damaniel

Banned
I just bought a 5820K this afternoon as the basis for my new PC after building a few at work to run some of our builds (sorry, Skylake). At the rate things are going, that 5820K might end up being the last CPU I ever buy.
 

Ptaaty

Member
A 6700K is going to be faster than a 2500K, but you have to really dig to find games where you will ever see a difference, and by that I mean dropping below 60fps because of the CPU.

Techspot usually does a CPU bench in their reviews. Rise of the Tomb Raider is apparently surprisingly heavy on the CPU: http://www.techspot.com/review/1128-rise-of-the-tomb-raider-benchmarks/page5.html

Even here, it looks like a normal OC of 4.3-4.6GHz on a 2500K would have it right neck and neck. Other newer games (Fallout 4, Battlefront, etc.) are all way over 60fps minimum even at stock speeds.

GTA V shows the most difference I could find in a couple of minutes, and this was the quote in the article: "As you can see, GTA V is quite CPU bound.........The Core i5-4690K was just 3fps slower than the Core i7-5960X, so it is safe to say investing in a Core i7 processor for GTA V isn't money well spent."

There is a reason 2500K owners are like vegans: they've been using a $200-or-so processor for about five years and still have no gaming reason to upgrade (for nearly everyone).
 
I guess future smug PC gamers will no longer brag about their games running better than consoles. Instead they will brag about their games consuming less power.

The PC's technological superiority has been dead for about six years; it's a myth to keep PC gamers feeling like the $1000 they spent is worth it.
 

RE4PRR

Member
A 6700K is going to be faster than a 2500K, but you have to really dig to find games where you will ever see a difference, and by that I mean dropping below 60fps because of the CPU.

Techspot usually does a CPU bench in their reviews. Rise of the Tomb Raider is apparently surprisingly heavy on the CPU: http://www.techspot.com/review/1128-rise-of-the-tomb-raider-benchmarks/page5.html

Even here, it looks like a normal OC of 4.3-4.6GHz on a 2500K would have it right neck and neck. Other newer games (Fallout 4, Battlefront, etc.) are all way over 60fps minimum even at stock speeds.

GTA V shows the most difference I could find in a couple of minutes, and this was the quote in the article: "As you can see, GTA V is quite CPU bound.........The Core i5-4690K was just 3fps slower than the Core i7-5960X, so it is safe to say investing in a Core i7 processor for GTA V isn't money well spent."

There is a reason 2500K owners are like vegans: they've been using a $200-or-so processor for about five years and still have no gaming reason to upgrade (for nearly everyone).

Those are at 1080p, though; the difference becomes much more apparent the higher the resolution. If you're only on 1080p then I see no reason to change, but for those of us on 1440p, 3440x1440, or 4K, you need to maximize gains and reduce bottlenecks as much as possible so as not to waste any fps.
 