
I think I'm happy that CPU technology is hitting a wall

Yes, it's running at 4.5GHz.
There's nothing odd about it - the CPU is maxed out and bottlenecking the GPU.
I can set the game to Ultra settings at 2880x1620 and still hit 43 FPS, but it will not run any faster than that due to the CPU.
DX12 performance is exactly the same. On the 1070 it actually runs okay in DX12 mode, on my 960 it caused the game to stutter badly.

It really bothers me that people keep repeating that there's no need for faster CPUs today, or claiming that their 2500K is just fine when paired with a fast GPU.
Average framerates will appear to be fine but your minimum framerates will be below 60 and games will stutter badly.

2500K is done. You might get away with a 2600K at 4.5GHz.
3770K is the minimum. Overclocked preferred.
 
You'll find that most games, even the newest ones, barely use all your CPU power.
So the performance difference between 2nd, 3rd and 4th generation Intel is mostly negligible. Great if you buy secondhand: I got a 4th-gen CPU and a Xeon for the price of a new Skylake, and they have long lifespans. The boards, however..

Overclocking your CPU to a high GHz often won't net any notable gains in FPS.
If you truly want an FPS increase, what you're looking for is a Skylake and the fastest DDR4 memory you can get your hands on. Ironically, memory speed now dictates how fast your game will run once you already have a high-end card.

It makes a significant difference in games like The Witcher 3.
 
2500K is done. You might get away with a 2600K at 4.5GHz.
3770K is the minimum. Overclocked preferred.
Well that was my point.
There are so many people here saying that a 2500K is still competitive, but it's really not.
Unfortunately, the latest Intel CPUs aren't so much faster that they avoid bottlenecking fast cards either.
Even a heavily overclocked 6700K will bottleneck a Titan XP, and probably a 1080 in some games as well. Maybe even a 1070.

Ivy Bridge (3xxx) is not much faster than Sandy Bridge.
In most cases the CPU will not overclock as high, so the higher IPC is offset by a lower clockspeed.
I do wish that I had bought an i7 rather than an i5 now. It didn't matter for games 6 years ago but newer games definitely benefit from hyperthreading.

My plan is to wait for Zen or a 7900K next year, because we seem to have hit a limit on clockspeed/IPC and the only option now is to add more cores.
The problem right now is that by choosing more cores, you're giving up per-core performance.
Since many games don't take advantage of more than 4 cores today, it means that buying a 6 or 8 core CPU is giving up performance vs a quad-core chip.
I'm hoping that will be less of an issue with the next generation of CPUs.
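The cores-vs-per-core-speed tradeoff in this post is exactly what Amdahl's law predicts. A back-of-the-envelope sketch: the clock speeds and the 50% parallel fraction below are made-up illustration numbers, not measurements of any real game.

```python
def speedup(parallel_fraction, cores):
    """Amdahl's law: overall speedup from `cores` cores when only
    `parallel_fraction` of the work can actually run in parallel."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

def effective_perf(clock_ghz, parallel_fraction, cores):
    """Relative performance: per-core clock scaled by the Amdahl speedup."""
    return clock_ghz * speedup(parallel_fraction, cores)

# Hypothetical game that only parallelises ~50% of its frame work:
quad = effective_perf(4.5, 0.5, 4)  # 4 cores at 4.5 GHz
hexa = effective_perf(4.0, 0.5, 6)  # 6 cores at 4.0 GHz
# The quad-core wins (7.20 vs ~6.86) despite having fewer cores,
# because the serial half of the frame runs at the higher clock.
print(f"quad-core: {quad:.2f}, six-core: {hexa:.2f}")
```

If the parallel fraction rises toward 90%+, the six-core flips ahead, which is the "game engines written to take advantage of them" hope.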

It's just really disappointing to see so many people here praising the fact that CPU performance has stagnated as if that was a good thing.
 
If you truly want an FPS increase, what you're looking for is a Skylake and the fastest DDR4 memory you can get your hands on. Ironically, memory speed now dictates how fast your game will run once you already have a high-end card.

It makes a significant difference in games like The Witcher 3.

This is interesting. I have a 2600k at stock, 16GB 1600mhz DDR3 RAM, and a GTX 970. At 1080p The Witcher 3 is STILL GPU limited. If I drop below 60fps it's because my GPU has hit 100% load. Would I really see any gains from a Skylake and new RAM? Or is this only for when you're GPU limited, like trying to run TW3 at 1440p-4k?

Reading reviews of the 7700K, I'm hoping there will finally be a big enough backlash that Intel gets their collective asses together and designs something truly revolutionary like Sandy Bridge again. Especially with competition from AMD's Zen next year. As much as I want to build a new PC soon, I'm terrified of buying a CPU like Kaby or Cannon if they're continuing this trend of tiny incremental upgrades (or no upgrade at all in Kaby Lake's case). That's not something I want to financially support, and at this stage I feel like a decent CPU advancement is inevitable soon, just maybe not from Intel.

The problem right now is that by choosing more cores, you're giving up per-core performance.
Since many games don't take advantage of more than 4 cores today, it means that buying a 6 or 8 core CPU is giving up performance vs a quad-core chip.

Yeah, I'm still playing "modern" games that barely make use of more than a single core. The Elder Scrolls Online is almost entirely single core, and it really angers me. There's literally nothing I can do to improve performance with any upgrades. I can only boost resolution with a new GPU. Performance is entirely up to the devs, and they're clearly either incompetent, don't care, or both.
 
Still running my 2600K at 4GHz, which has let me play almost all games at Ultra 1080p/60fps. Best purchase I made in 2012.

Was thinking about upgrading, but since I don't do 4K I don't really see the point.
Just a pet peeve of mine: I hate it when people conflate 4K rendering with CPU power. It's almost entirely GPU dependent. Where a faster CPU comes in handy is in higher framerates. A 2600K won't run every game today at 60 unless overclocked, and even then, the actual 'high end' is higher framerates, which a 2500K simply won't do in modern games.
 
Yes, it's running at 4.5GHz.
There's nothing odd about it - the CPU is maxed out and bottlenecking the GPU.
I can set the game to Ultra settings at 2880x1620 and still hit 43 FPS, but it will not run any faster than that due to the CPU.
DX12 performance is exactly the same. On the 1070 it actually runs okay in DX12 mode, on my 960 it caused the game to stutter badly.

It really bothers me that people keep repeating that there's no need for faster CPUs today, or claiming that their 2500K is just fine when paired with a fast GPU.
Average framerates will appear to be fine but your minimum framerates will be below 60 and games will stutter badly.

What speed does your memory run at?
 
My GTX 1070 arrived today.
I don't know how people can say that there's no need to upgrade from a 2500K, or that we don't need faster CPUs.
I can keep turning up the graphics, but I can't get more than 43.7 FPS here.

Deus Ex: Mankind Divided:

It's probably the game that's at fault. I watched a few videos on YouTube, and even an i7 6700 with a GTX 1070 was not far off 40fps at 1440p.

edit: another benchmark on high settings, where even an i3 had a better average framerate than the i7 at 1440p, around 2:26 into the video https://www.youtube.com/watch?v=82EsZbj7sPc

So try some other game instead, and see if you still don't get any performance gain.
 
The lack of competition is causing this market to stagnate. Let's hope Zen can change this.

What people want here for games is an absolute performance increase. Zen will not bring this. What it will do is bring better performance for a particular price bracket. Intel parts will still be faster as sold and unlocked parts will probably be able to reach higher frequencies than anything from AMD.

Sadly, there won't be any IPC improvements from Intel until Ice Lake in ~2018. Our niche, performance desktop parts, is no longer being served by Intel. Instead they concentrate on the mobile and server markets, where lower power use is a real benefit.
 
What people want here for games is an absolute performance increase. Zen will not bring this. What it will do is bring better performance for a particular price bracket. Intel parts will still be faster as sold and unlocked parts will probably be able to reach higher frequencies than anything from AMD.

Sadly, there won't be any IPC improvements from Intel until Ice Lake in ~2018. Our niche, performance desktop parts, is no longer being served by Intel. Instead they concentrate on the mobile and server markets, where lower power use is a real benefit.

I also don't think it's helping that almost all emphasis on visuals right now is resolution. Especially with consoles chasing 4k so hard...
 
What speed does your memory run at?
DDR3 1866MHz. It's not a memory bandwidth or latency issue, the game is simply maxing out the CPU.
All four cores are hitting 100% usage.

As I said in my previous posts, CPU benchmarks need to be looking at minimum framerates, not averages.
It doesn't matter how fast your GPU is if your CPU can't prepare frames quickly enough.
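The minimums-vs-averages point is easy to make concrete in a few lines of Python. The frame times below are invented to show how a handful of slow (CPU-spike) frames tanks the 1% low while barely moving the average:

```python
def fps_stats(frame_times_ms):
    """Return (average FPS, 1% low FPS) from a list of frame times.
    The 1% low is the FPS implied by the slowest 1% of frames."""
    n = len(frame_times_ms)
    avg_fps = 1000.0 * n / sum(frame_times_ms)
    worst = sorted(frame_times_ms, reverse=True)[: max(1, n // 100)]
    low_fps = 1000.0 * len(worst) / sum(worst)
    return avg_fps, low_fps

# Mostly-smooth run with a few long stalls (made-up numbers):
frames = [12.0] * 95 + [40.0] * 5  # 95 frames at 12 ms, 5 at 40 ms
avg, low = fps_stats(frames)
print(f"average: {avg:.0f} FPS, 1% low: {low:.0f} FPS")
# The average looks like ~75 FPS, but the 1% low is 25 FPS -
# that is the stutter a "good average" benchmark hides.
```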

It's probably the game that's at fault. I watched a few videos on YouTube, and even an i7 6700 with a GTX 1070 was not far off 40fps at 1440p.
Everyone is so quick to blame the game/engine/developers when the CPU requirements are high.

The reason that a 6700K won't outperform it significantly is because silicon has hit a limit.
Sandy Bridge was the first generation of CPU from Intel that could reliably hit 4.5GHz, and nearly every CPU since is still stuck around 4.5GHz when you keep the voltages within specifications.
If you put more voltage into them than recommended, cutting short the life of your processor, you can maybe hit 4.8-5.0GHz.

IPC (performance per clock) has not improved dramatically since the Sandy Bridge CPUs were introduced - that was the last big leap in performance.
The new CPUs are faster, but not significantly faster the way that CPU upgrades used to be.

I would expect better framerates on a 6700K/7700K but those may still not be enough to keep the minimum above 60 FPS.
That's why we need more cores, and game engines written to take advantage of them.
Intel just can't make quad-cores much faster than they are now.
This is very apparent now with the 7700K which is only 1% faster than the 6700K at the same clockspeed.
All of the improvements are to the platform instead of the CPU itself.
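That "only 1% faster at the same clockspeed" claim can be checked for any pair of CPUs if you have a benchmark score and the sustained clock for each, since overall performance factors as clock × IPC. The scores and clocks below are hypothetical stand-ins, not real 6700K/7700K measurements:

```python
def decompose_gain(score_a, clock_a_ghz, score_b, clock_b_ghz):
    """Split CPU B's overall gain over CPU A into a clockspeed factor
    and an IPC (per-clock) factor. Overall gain = clock factor * IPC factor."""
    clock_factor = clock_b_ghz / clock_a_ghz
    ipc_factor = (score_b / clock_b_ghz) / (score_a / clock_a_ghz)
    return clock_factor, ipc_factor

# Made-up scores in the spirit of a 6700K vs 7700K comparison:
clock_f, ipc_f = decompose_gain(100.0, 4.2, 108.0, 4.5)
print(f"clock: +{(clock_f - 1) * 100:.1f}%, IPC: +{(ipc_f - 1) * 100:.1f}%")
# With these numbers, nearly all of the 8% gain is clockspeed;
# the per-clock improvement is under 1%.
```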

That's why all of the improvements Intel makes are now about efficiency and not performance.
It's not that they're lazy because AMD hasn't been competitive.
Intel's biggest competitor is themselves.
They don't want people sticking with 6+ year-old CPUs.

another benchmark on high settings, where even an i3 had a better average framerate than the i7 at 1440p, around 2:26 into the video https://www.youtube.com/watch?v=82EsZbj7sPc
That benchmark is GPU-limited, not a test of CPU performance.
You can see that the GPU is sitting at 99% load throughout the whole test.
This is a big part of the problem - a lot of people posting "benchmarks" between products don't know what they are doing.
When you're comparing CPUs like that, you need to turn all the graphical options down as low as they can go.
You need to take the GPU out of the equation so that the CPU is the limiting factor.

That's why my test showed 43.7 FPS with both the GTX 960 and the GTX 1070.
In both tests, the GPU load was well below 100% and the limiting factor was the CPU.
That's why I was able to turn the graphics up to Ultra and still maintain that 43 FPS with the GTX 1070 - because my test was CPU-limited and not GPU-limited in any way.
If I had turned the graphics options up even higher, then my framerate would have started to drop below 43 FPS due to being GPU-limited.
But nothing is able to increase the minimum framerate above 43 FPS other than replacing the CPU.
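The rule of thumb in this post (GPU pegged near 100% = GPU-limited; GPU load well below 100% = CPU-limited) can be written as a tiny classifier over logged GPU-utilization samples. The 95% and 90% thresholds are arbitrary choices for illustration, not a standard:

```python
def likely_bottleneck(gpu_load_samples, busy_threshold=0.95, busy_share=0.9):
    """Rough heuristic: if the GPU sits near full load for most of a
    benchmark run, the run is GPU-limited; if it has significant idle
    time, something upstream (usually the CPU) is the limiting factor."""
    busy = sum(1 for load in gpu_load_samples if load >= busy_threshold)
    return "GPU-limited" if busy / len(gpu_load_samples) > busy_share else "CPU-limited"

print(likely_bottleneck([0.99] * 60))            # GPU pegged the whole run
print(likely_bottleneck([0.65, 0.70, 0.60] * 20))  # GPU has lots of idle time
```

This is why turning settings down to minimum is the right way to benchmark CPUs: it drives the GPU-load samples down and lets the CPU become the limiter.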
 
CPUs not improving as quickly is good for PC gamers, but it's not so great for things like basic research and improvements in e-learning. Overall I'm not happy.
 
To be honest I'm happy about this too. My OC'd 3570K is getting a bit long in the tooth now, as I noticed in Battlefield 1, for example, paired with a GTX 1070, and I'm looking to upgrade next year. But yeah... with Kaby Lake being a dud upgrade-wise, I might as well jump to Skylake. The thing is, I would like to upgrade to a true 6-core CPU, but I'm not paying 414€ for a CPU alone, so I hope they become a bit cheaper at least with time :/
 
I'm pretty okay with CPU tech not advancing as much, but honestly, it seems like the bigger driver for upgrading is motherboards now. The mobo makers are integrating lots more features into their products, like built-in WiFi cards, and that has a good deal of value in certain situations.

That said, Microsoft tying Windows keys to your hardware config makes the whole "CPUs aren't advancing much" thing seem like an even better deal.

Is there a way around this? Having to buy a new operating system if I build a new PC feels like a tax at this point.
 
I was just thinking about this yesterday while browsing eBay. I wanted to get my cousin a light gaming PC, but I really didn't want to spend over $150 tops. He's currently using an old dual-core PC with an ancient Nvidia 240, running Windows XP, so I was wondering if I could just get him a new tower or motherboard or something. Then I started looking up older workstations.

To make a long story short, I picked up an HP Z400 workstation with 6GB RAM, a 475-watt 80+ PSU, a 250GB HDD, Win7 Pro, and a Xeon 3520 for just over $100. Then I picked up a used GTX 750 SC for $65.

So that's $165 for something he can play older games on. Possibly even some new ones on medium settings.
 
Linux.

Nah, keys are tied to the MS account. Worst case scenario you just have to call MS and answer a bot. Worked fine for me.

No, win 10 keys are tied to the hardware.
After some number of hardware changes (a strike based system with different weightings per component), the key will become invalid. Sometimes a call to MS can sort this out.

For win 7, just use the same key and call MS - tell them the HDD broke and they give a code to reactivate it.
 
Upgraded this year to a Q1 2010 processor.. lol. The Xeon X5650 on socket LGA1366, retiring my 7-year-old i7 920.

This thing performs like a beast (trades blows with modern i7s) when overclocked to 4+GHz. The best part is that I got it for $80 CAD. I hope this setup can stay around for a few more years.
 
Upgraded this year to a Q1 2010 processor.. lol. The Xeon X5650 on socket LGA1366, retiring my 7-year-old i7 920.

This thing performs like a beast (trades blows with modern i7s) when overclocked to 4+GHz. The best part is that I got it for $80 CAD. I hope this setup can stay around for a few more years.

Running a W3690 here but need to replace the cooler. It hits 4GHz, but gets a bit too hot due to the junk stock cooler it has. But even at stock, with my 1070, it runs anything and everything I throw at it. Some of the Xeon chips are beasts for the money.
 
Well, after reading a few Kaby Lake reviews, I'm glad I went with the i7 6700K around 6 months ago. The extra long wait would certainly not have been worth it for such a small difference.
 
Nobody who cares about frame pacing can honestly agree with this, I think.

Also, "I'm happy I don't have to buy new stuff" is a pretty bad reason for not wanting technological progress, in my opinion.
 
Well, after reading a few Kaby Lake reviews, I'm glad I went with the i7 6700K around 6 months ago. The extra long wait would certainly not have been worth it for such a small difference.

It never has been, between each gen of the 2xxx-6xxx CPUs. It's just worse this time.
The only reason people had to wait was for new platform features, which IMO are still underwhelming.

Running a W3690 here but need to replace the cooler. It hits 4GHz, but gets a bit too hot due to the junk stock cooler it has. But even at stock, with my 1070, it runs anything and everything I throw at it. Some of the Xeon chips are beasts for the money.

Its equivalent was the i7 980X. These old chips are still pretty good if you overclock heavily. They might start to struggle with a GTX 1070/1080. And forget about GTX 9xx SLI on these CPUs.

As people have said, it's the minimum fps and frame pacing that suffer, even with these pretty good old-school CPUs.
 
Not wanting technology to advance because of your own personal agenda is terrible.
The problem with CPUs is that they can render your whole system useless when they become obsolete, and they were becoming obsolete way too fast. Now it's more manageable: they still advance, just at a slower pace; they are not static. And you can focus on the GPU, which is easier to upgrade as well (just replace the old one - no new motherboard or fresh Windows install required). Being forced to replace your base system every 5 years instead of every 2 isn't a bad thing.
 
No, win 10 keys are tied to the hardware.
After some number of hardware changes (a strike based system with different weightings per component), the key will become invalid. Sometimes a call to MS can sort this out.

For win 7, just use the same key and call MS - tell them the HDD broke and they give a code to reactivate it.

Since the Anniversary Update, a W10 key can be attached to your MS account, if you enable it.
 
All I'll say in this thread is that some people in another forum, discussing this topic, brought up an interesting point. Intel's advantage over AMD isn't simply clock speed. They have better IPC because of more sophisticated branch prediction and more refined use of the L1 and L2 caches. I've always wished it were possible to upgrade the cache, because it's the most expensive part of a CPU but would offer the largest improvements if you were willing to pay for it and either company were willing to make that type of upgrade possible.
 
That benchmark is GPU-limited, not a test of CPU performance.
You can see that the GPU is sitting at 99% load throughout the whole test.
This is a big part of the problem - a lot of people posting "benchmarks" between products don't know what they are doing.
When you're comparing CPUs like that, you need to turn all the graphical options down as low as they can go.
You need to take the GPU out of the equation so that the CPU is the limiting factor.

That's why my test showed 43.7 FPS with both the GTX 960 and the GTX 1070.
In both tests, the GPU load was well below 100% and the limiting factor was the CPU.
That's why I was able to turn the graphics up to Ultra and still maintain that 43 FPS with the GTX 1070 - because my test was CPU-limited and not GPU-limited in any way.
If I had turned the graphics options up even higher, then my framerate would have started to drop below 43 FPS due to being GPU-limited.
But nothing is able to increase the minimum framerate above 43 FPS other than replacing the CPU.

I found it strange that the CPU usage varies, since I found an i5 2500K test where the CPU was at almost 100% like yours.
After seeing a video of it running pretty well on an AMD 8-core CPU, I guess this game maybe needs more than 4 threads? That's why it runs poorly on your 4-core i5 and pretty well on the i7 CPUs in the videos I've watched.
Still, all platforms get bad framerates, like under 60, but the CPU is under 60% on all of these.

edit: a bit like Battlefield 1 on a dual-core Pentium with no hyperthreading.. it runs at about half the speed of a similar i3 with hyperthreading.
 
Lack of progress is also down to two other things:
- the push for low power over raw compute performance, meaning more emphasis on 15W chips for ultrabooks etc.
- the emphasis on integrated graphics using a lot of die space.

Even at current process node die sizes there is plenty of space for 6-8 CPU cores if they wanted. But they'd be hotter, need more power, and they'd need to drop/reduce the integrated graphics. Basically they'd be enthusiast chips. And Intel does make those in the -E range (and charges you crazy amounts accordingly).

Intel could release a 6/8 core CPU at affordable prices tomorrow if it wanted to. It doesn't want to. Maybe AMD and Zen will be good enough to push Intel to do more in that area

I agree, but I'm talking about beyond that. 8-10 years.
 
Just a pet peeve of mine: I hate it when people conflate 4K rendering with CPU power. It's almost entirely GPU dependent. Where a faster CPU comes in handy is in higher framerates. A 2600K won't run every game today at 60 unless overclocked, and even then, the actual 'high end' is higher framerates, which a 2500K simply won't do in modern games.
This is pretty interesting.. If the Switch is close to Xbox One specs and has a CPU that is significantly better than the Xbox One's, I wonder whether multiplat versions will have more stable framerates. Who knows about its RAM, though.

Then again, if CPUs are such a huge factor in framerate, how come PS4 multiplat games often have higher framerates than Xbox One, even though the latter has a slightly better CPU?
 
Have an i5 2500K from 2011. It's still a decent CPU, but I agree it's now time to move on. For the games I play it still works great.
I do want to change the whole PC every 5 or 6 years, just because some USB ports stop working, there are new standards like USB 3.1, there's M.2 for SSDs, DDR4 and so on. And the case can also be changed. Standards change and improve.

I don't upgrade CPUs while keeping the motherboard, as it's a pain, IMO. A new CPU+MB+RAM every 5 years is pretty OK. I'd just need a new one with 32GB RAM for some virtual machines.

So right now I'm waiting to see what AMD has to offer with Zen. The 7700K doesn't seem to be really better than the 6700K from first impressions - we'll see. When the 7700K and Zen hit, the 6700K might also become cheaper.
If you use your PC for things besides gaming (it's my main surfing device, and I often use HandBrake, do video editing and some programming), PC gaming is super cheap.

I don't view it like some people do who compare a 1200€ gaming PC to a console.
I'll buy the new PC anyway - gaming-related costs are only GPU-related.

I keep GPU upgrades asynchronous. GPUs can be switched out easily, and there are more and faster jumps there.
I switch when there is a good deal. The 970 was pretty decent back then and cheaper than a PS4 or Xbox One. You can also sell your old card - while selling a console forces you to delete accounts, copy stuff and so on.

A new MB, CPU+RAM and case every 5 years is pretty OK, IMO. Especially if you use your PC for more than gaming. That is a short console cycle. A new TV (if you use it often) every 5 years is also OK.

So I don't see this investment as gaming-related at all. I want a new PC every 5 years.
The only gaming-related hardware investment is GPU upgrades. And if you don't aim for the best out there, GPUs are often cheaper than consoles.
But when you buy a new PC, get a decent one. It is the platform for the next 5 years.
The i5 2500K was a great deal back then. I'll keep my 970, as it is still good enough for me, and upgrade the GPU later.

As I didn't register Windows, I might have to buy it new. But you can get it super cheap.
A W10 Pro key is around 30€ right now. It doesn't bother me much if I have to buy W10 now - the upgrade was free. It sucks that you have to buy it again, but W7 was like 80€ (and that was cheap, with OEM keys).
 
I did almost 8 years with my OC'd Q6600. Best investment I ever made in a PC. And everyone told me a quad-core was overkill back then and nothing would use it. "Get an E6600 instead," they said. Glad I didn't - they all ended up upgrading waaaay earlier.
 
This is pretty interesting.. If the Switch is close to Xbox One specs and has a CPU that is significantly better than the Xbox One's, I wonder whether multiplat versions will have more stable framerates. Who knows about its RAM, though.

Then again, if CPUs are such a huge factor in framerate, how come PS4 multiplat games often have higher framerates than Xbox One, even though the latter has a slightly better CPU?
That's only with respect to resolution, which mostly stresses the GPU. For a game to run at 60fps, both the CPU and GPU parts of a frame need to complete in ~16ms. The Xbox One is a good example of this: in some limited cases, with games that are CPU bound, you might get fewer framerate drops on the Xbox One than on the PS4, though the Xbox One version might also need to run at considerably downgraded graphical settings and/or a lower resolution.
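Under the simplifying assumption that the CPU and GPU work on different frames in parallel (a pipelined renderer), the ~16ms budget means the slower of the two stages sets the framerate. The millisecond figures below are made up for illustration:

```python
def frame_rate(cpu_ms, gpu_ms):
    """Assuming a pipelined renderer where the CPU prepares frame N+1
    while the GPU draws frame N, frame time is set by the slower stage."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# Both stages inside the ~16.7 ms budget -> 60 FPS:
print(round(frame_rate(14.0, 16.7)))  # 60
# CPU blows the budget -> framerate drops no matter how fast the GPU is:
print(round(frame_rate(23.0, 10.0)))  # 43
```

The second case is the "faster GPU changes nothing" situation described earlier in the thread: lowering graphics settings shrinks `gpu_ms`, but the frame rate stays pinned by `cpu_ms`.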
 
Why on earth would anyone be happy about this? I can't believe so many people agree with the OP. If better CPUs were coming out, the current top-tier CPUs would be much cheaper than they are now.

Not to mention if CPUs were advancing further, we would see large gains in all manners of nanotechnology that would improve AI, smartphones, etc.

Did you perhaps mean that you're glad that CPU power isn't as important in gaming anymore?
 
I did almost 8 years with my OC'd Q6600. Best investment I ever made in a PC. And everyone told me a quad-core was overkill back then and nothing would use it. "Get an E6600 instead," they said. Glad I didn't - they all ended up upgrading waaaay earlier.

It's the same shit every time. "No need to buy an i7, games won't use the extra threads anyway", "No need to get a GPU with more than 2GB VRAM", etc.

Two years later, they have to run games with medium textures, and some games are eating up all those i7 threads for breakfast.

I'm using an i7 5820K (12 threads) at 4.5GHz and I'm still very CPU limited in 64-player matches on some maps in BF1. I go down from an average of 120+fps to ~80, and GPU usage drops to 60-70%.

I would love for there to be a huge jump in CPU performance very soon. Screw everyone who thinks it's a good thing we're getting 5-7% performance gains every new generation.
 
You know what? I'm glad nice things don't exist because I might be pressured to own them if they did. Screw progress. Screw anyone doing difficult loads. I'm afraid games might be made for PCs that are better than mine. Never mind that this is the exact bet you're making by going with PC gaming.
 
It's the same shit every time. "No need to buy an i7, games won't use the extra threads anyway", "No need to get a GPU with more than 2GB VRAM", etc.

Two years later, they have to run games with medium textures, and some games are eating up all those i7 threads for breakfast.

I'm using an i7 5820K (12 threads) at 4.5GHz and I'm still very CPU limited in 64-player matches on some maps in BF1. I go down from an average of 120+fps to ~80, and GPU usage drops to 60-70%.

I would love for there to be a huge jump in CPU performance very soon. Screw everyone who thinks it's a good thing we're getting 5-7% performance gains every new generation.
Have you tried Vulkan?
 
2500K is done. You might get away with a 2600K at 4.5GHz.
3770K is the minimum. Overclocked preferred.

A 2500K @ 4.5 is still faster than anything AMD produces and still performs competitively against non-K i5 CPUs.

But of course, changing to an i7 6700K is a nice upgrade, since software is finally starting to use >4 threads.
 