
Is there proof that Core i7 is better than i5 for gaming?

There's usually a $100 difference between the highest-end i5 and i7. I've used my 4770K for three and a half years. In that time, I've used it for Crysis 3, HandBrake, quite a bit of CG rendering, every time I've watched a 4K60 video on YouTube, and every time I've played a game at 120fps. I plan on keeping it until right before next gen in 2019/2020.

$100 extra over a 6-7 year period for all of that is probably worth it.
 
Intel's typical MO would be to push the 6-cores upward.

Hopefully Ryzen isn't a bust, and will change that.

The thing is that they've specifically been hinted at as the very first mainstream-segment 6-cores. Expanding the scope of that segment upward would just be weird, and would only be good for people who don't need or want an X99-level board.
 
If you had asked this question a year or two ago I would have answered no, but game engines have advanced significantly in the last few years. The fact that consoles have many slow cores has also pushed games to support more than 4 cores or threads. All the major recent games use all 8 threads on my Core i7. I have seen over 70% usage in games like Battlefield 1 and Fallout 4. Gears of War 4 hits 100% from time to time depending on the situation, and that's on a Kaby Lake Core i7. If you want high framerates then you need a proper CPU to be able to feed the GPU. Though I don't think a quad-core i5 would be an issue if it's clocked high enough.
 
I don't think it's worth the price difference. You'd be paying 30-40% more for less than a 10% performance improvement in ideal scenarios, and often no difference at all. You're better off putting that money toward a GPU, unless money is no object and you can get top-end everything.

Overclocked i5 is fine for most.

Basically this. People tend to say that HT is "free", but it really isn't. Paying more than $10 extra for HT is mostly pointless from a gaming perspective, so when the choice is between an i5 and an i7 that costs 50% more, you should really go with the i5.

It is also true that some games are actually worse with HT than without it.

I regret buying an i5. To be fair, I did not plan on keeping my CPU for as long as I have, but I picked an i5-2500K back when it "didn't matter for games".
Now my i5 has fallen behind and has been bottlenecking games for the past year or two.

In contrast, the i7-2600K can actually beat an i5-7600K in some games now that they're making use of more than four threads.
These days it's pretty clear that you should buy an i7 and fast RAM if you want the best performance - especially if you care about minimum framerates more than averages.

That said, Ryzen is launching a couple of weeks from now and the leaks so far have been very promising.
It's looking like they will be far better value for money, with a 6c/12t CPU for a similar price as the 4c/4t i5-7600K and 8c/16t for a similar price as the 4c/8t i7-7700K - while offering better performance.
But we still need to wait for proper reviews instead of judging them based on one or two leaked benchmarks.
Even if they don't best Intel, they're still going to be very good value and there will finally be some competition in the CPU space again - which is good for everyone. (except Intel)

In one or two tests. We have no idea how it performs overall yet, and one of the tests is a bit concerning regarding memory latency.
I'm still expecting a 5GHz 7700K to be the best performer in most games, since most games do not take advantage of more than four threads - though many newer ones do now.
But if AMD can get 90% of the performance of the comparable Intel chips at these prices, it's going to have a significant impact - especially with how ridiculous Intel's pricing has been for their high end CPUs in recent years.

Not really sure where you're getting this from.

http://images.anandtech.com/graphs/graph11083/85559.png


HT is getting a lot of praise these days, and the reasons for this are unclear to me. Sure, it helps sometimes, but it's not the same as having real cores in place of the virtual ones.
 
When you have a Mac it doesn't matter for gaming. Though as a VFX artist, the more CPU power the better. Can you get dual i7s now, or do you still need to go with Xeons for that?
 
The i7 is better in the long run, and you can turn off Hyper-Threading in the BIOS for those games that don't like it.

Best of both worlds.
 
If you plan on recording games at all an i7 is essential.

If you're recording games for any "professional" reason, you should be running a completely separate machine for capture.

Trying to capture and play on the same machine is never a great idea, and the cost of upgrading a mid-tier PC to a high-end machine that can do both can often be higher than buying a separate capture machine.
 
Not worth it when you can spend the money saved by getting an i5 on a better GPU which has far more value and impact to your performance.
 
Not worth it when you can spend the money saved by getting an i5 on a better GPU which has far more value and impact to your performance.

"Not worth it" depends a lot on what you are actually going for.

If you only game at 60Hz, then yes, the i5 will most likely meet all your needs for some time to come (but an i7 will last you even longer as seen with the 2600K).

At 120Hz and above, you really want all the CPU power you can get, because then you are probably pairing a pretty strong GPU with 1080p, where the CPU becomes the limit.

Before one of the more recent BF1 patches, I was severely CPU-limited even with a 6-core 5820K @ 4.5GHz. I am still CPU-limited on certain maps, but it's much improved and I rarely drop below 120fps now. Whereas before, I could go down to 60-70% GPU utilisation and get ~80fps on some maps.
 
Not really sure where you're getting this from.
http://images.anandtech.com/graphs/graph11083/85559.png
HT is getting a lot of praise these days, and the reasons for this are unclear to me. Sure, it helps sometimes, but it's not the same as having real cores in place of the virtual ones.
Anandtech don't know how to benchmark games.
Their tests are worse than useless - they're very misleading.

Techspot's Gears of War 4 testing shows a clear divide between i5 and i7 CPUs when you look at minimum frame rates:

GameGPU's testing shows the i7-2600K near the top - ahead of all i5s:
[image: gw4_proz.png - GameGPU Gears of War 4 CPU results]


There are an increasing number of games which show this sort of performance gap between i5 and i7 CPUs.
The question right now is whether a 4c/8t i7 or an 8c/16t Ryzen CPU is going to perform best in games.
 
That Techspot Gears 4 test shows that you need a Titan X to get much benefit from the i7, and that clock speed matters a lot. With a 980 Ti they are very close, so I guess ultimately the GPU is the main limiting factor.
[image: GTX980Ti.png - Techspot Gears of War 4 results with a GTX 980 Ti]


Too bad they didn't have tests at 1440p or 4K with the different CPUs, because to me that's where extra performance starts to matter. I can't really tell the difference between a game running at 100fps and 120fps, but the difference between holding a stable 30+fps at 4K and dropping under it would matter.
 
That Techspot Gears 4 test shows that you need a Titan X to get much benefit from the i7, and that clock speed matters a lot. With a 980 Ti they are very close, so I guess ultimately the GPU is the main limiting factor.
Too bad they didn't have tests at 1440p or 4K with the different CPUs, because to me that's where extra performance starts to matter. I can't really tell the difference between a game running at 100fps and 120fps, but the difference between holding a stable 30+fps at 4K and dropping under it would matter.

Generally, resolution doesn't matter in CPU-bound scenarios: if the CPU is capable of pushing 120fps at 1080p, it will be able to do the same at 4K, provided the GPU power is there.
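As a rough illustration of that bottleneck logic, here's a minimal sketch (all the numbers are made up for illustration, not benchmark results):

```python
# Each frame needs both CPU work (simulation, draw calls) and GPU work
# (rendering), so whichever is slower caps the framerate.

def effective_fps(cpu_fps: float, gpu_fps: float) -> float:
    return min(cpu_fps, gpu_fps)

# Hypothetical CPU that can prepare 120 frames/s regardless of resolution:
print(effective_fps(cpu_fps=120, gpu_fps=300))  # 1080p, fast GPU -> 120 (CPU-bound)
print(effective_fps(cpu_fps=120, gpu_fps=130))  # 4K, fast GPU    -> 120 (still CPU-bound)
print(effective_fps(cpu_fps=120, gpu_fps=60))   # 4K, weaker GPU  -> 60  (GPU-bound)
```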
 
WD2 seems to be an example of a modern engine which scales well with CPU parallelism:
http://gamegpu.com/images/stories/Test_GPU/Action/WATCH_DOGS_2/new/w3_proz.png


Will be interesting to see Ryzen benchmarks in that.
 
That Techspot Gears 4 test shows that you need a Titan X to get much benefit from the i7 and clock speed matters a lot. For a 980 Ti they are very close. So I guess ultimately GPU is the main limiting factor.
You don't need a Titan X. For Ultra settings in Gears of War 4, maybe, but you can be bottlenecked by the CPU with any GPU as long as you choose appropriate settings for it.
If you bottleneck the game with your GPU by choosing graphics settings that it can't handle, then of course the difference between CPUs is going to be smaller.

But the issue is that many games are being bottlenecked by the CPU even with mid-tier cards.
I feel like I'm posting this every time CPU performance is brought up, but here are some results from Deus Ex: Mankind Divided:
The game is bottlenecked by my CPU and can't run any faster than 43 FPS in this scene.
If I tried using ultra settings on my 960, it might drop below 43 FPS, but since I chose appropriate settings for that GPU, the CPU is the limiting factor. (in other scenes, the highest GPU load I saw was about 96% with these settings on the 960)
Putting in a faster GPU can't increase the performance - it only drops the GPU load.
I can turn up the graphics options much higher with a GTX 1070 in the system, but performance is still capped at 43 FPS by the CPU.

Now, that's an old i5-2500K running at 4.5GHz, but I'm not sure that even an i5-7600K would be fast enough to keep the game above 60 FPS in that location: going from 43 FPS to 60 FPS requires a >40% uplift, and most tests seem to put it only about 30% faster on average.
Deus Ex is also one of the recent games which scales very well with CPU threads/cores, though, so an i7 would fare better.
It also had quite a few performance issues at launch, so most of the tests out there seem to be GPU-limited rather than showing off the impact that a faster CPU can have in some areas of the game.

WD2 seems to be an example of a modern engine which scales well with CPU parallelism: http://gamegpu.com/images/stories/Test_GPU/Action/WATCH_DOGS_2/new/w3_proz.png
Will be interesting to see Ryzen benchmarks in that.
Yes, I'm really looking forward to seeing how Ryzen compares across many different games, since their CPU requirements can be so varied.
Some only care about single-core performance, some scale well up to four threads, some scale well beyond four threads, some are very reliant on memory bandwidth/latency etc.
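A rough way to picture why games scale so differently with core count is Amdahl's law. A sketch, with made-up serial fractions standing in for the kinds of games described above:

```python
# Amdahl's law: speedup = 1 / (s + (1 - s) / n), where s is the fraction of
# frame time that can't be parallelized (e.g. a heavy main/render thread).
# The serial fractions below are illustrative guesses, not measurements.

def speedup(cores: int, serial_fraction: float) -> float:
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

for name, s in [("mostly single-threaded", 0.9),
                ("scales to ~4 threads", 0.3),
                ("well-parallelized engine", 0.1)]:
    print(f"{name}: " + ", ".join(f"{n} cores = {speedup(n, s):.2f}x" for n in (2, 4, 8)))
```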
 
For gaming you won't see a difference between the i5 and i7. There have been some games that benefit (marginally) from an i7, but that list is short and those games are old.

The only games I've seen reported as showing a difference (sometimes a significant one) are new, not old:

Gears of War 4
Watch Dogs 2
Battlefield 1
Assassin's Creed Unity
Rise of the Tomb Raider
The Witcher 3

Maybe just getting a better GPU is a better choice in most situations than spending on an i7, since CPU-limited situations aren't that frequent, but I still think 8 threads/cores are the way forward.
 
Yes, I'm really looking forward to seeing how Ryzen compares across many different games, since their CPU requirements can be so varied.
Some only care about single-core performance, some scale well up to four threads, some scale well beyond four threads, some are very reliant on memory bandwidth/latency etc.
Yes, I hope some reputable sites do frametime tests in CPU-limited scenarios right off the bat. That's what I'm most interested in. (Well, that and OC potential)
 
Still rocking the immortal 2500K; right now I'm just waiting for Cannonlake to hit before building a whole new rig.
At first I was sure I'd go with a K-series i7 if there's no horrible price gap between them, but if by that time I can get an i5 with at least 6 cores, I will definitely revisit this question and reconsider, since turning that extra money into more GPU power sounds reasonable. One more question remains: when will the crazy RAM prices normalize?
 
Basically this. People tend to say that HT is "free", but it really isn't. Paying more than $10 extra for HT is mostly pointless from a gaming perspective, so when the choice is between an i5 and an i7 that costs 50% more, you should really go with the i5.

It is also true that some games are actually worse with HT than without it.
Fewer and fewer games run better without HT. When I upgraded my PC a couple of years back, I benchmarked my games with it on and with it off. I think only GTA4 gave me better results with it disabled.

But of course, if you have an i7, you can still disable HT for the increasingly small minority of games that run better that way.
 
It would be extremely stupid to upgrade today, when AMD has new CPUs coming in a few weeks. Even if you might not be buying a CPU from them.

I just got an i7-7700 a week or two ago for $315.
I'll leave this post here to see how much of a difference AMD's CPUs make in whether or not that was a worthwhile decision.

The constant "Wait for (y) before buying (x)" with computer parts is rather tedious. At some point one just needs to decide on what to get, and get it.
 

BREAKING: Random user on Windows 7 records the opposite results of what he should be getting with a properly configured PC. News at 11.

Half a decade ago, unless a game was hitting the limits of a non-HT CPU, using HT would cause a small performance drop versus having it off, but that hasn't been true for quite some time.

The main benefits of going for an i7 are better frame pacing, framerate stability, higher minimum framerates, and the ability to have small background processes (music players, browsers, keyboard/mouse software, OC utilities, etc.) running without hindering game performance. Not to mention an i7 can become an i5 with a toggle in the BIOS, or with a Process Lasso type program, in the off chance HT is messing with an application. Unless the extra $75-$100 is impossible to squeeze out, there is no valid reason to go with an i5 given all the benefits an i7 offers. You can even delay upgrading a generation or two with the extra performance squeezed from a hyperthreaded processor.
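For what it's worth, the Process Lasso style fix boils down to restricting a game's CPU affinity to one logical CPU per physical core. A sketch of that idea using psutil, assuming HT siblings are enumerated adjacently (0/1 on core 0, 2/3 on core 1, ...), which is typical on Windows but worth verifying on your system:

```python
import psutil

def pin_to_physical_cores(pid: int) -> None:
    """Make a 4c/8t i7 behave like a 4c/4t i5 for one process."""
    logical = psutil.cpu_count(logical=True)    # e.g. 8 on a 4c/8t i7
    physical = psutil.cpu_count(logical=False)  # e.g. 4
    if logical == physical:
        return  # HT absent or disabled, nothing to do
    # Keep only one logical CPU per physical core (the even-numbered ones,
    # under the sibling-numbering assumption above).
    psutil.Process(pid).cpu_affinity(list(range(0, logical, 2)))

# Usage (hypothetical PID): pin_to_physical_cores(1234)
```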
 
Anandtech don't know how to benchmark games.
Their tests are worse than useless - they're very misleading.

On the contrary, their benchmarks are actually very relevant to how most people will experience these games, since only a handful of them will play in 720p/low or on a Titan XP.

The minimum fps example is a curious one, and I wonder if that's something unique to Gears 4, since I haven't seen this type of behavior anywhere else yet.

Durante's WD2 CPU benchmark example shows that in a properly parallelized engine, a modern i5 is ahead of an old i7, so as I've said, there's only so much benefit HT can give you; virtual cores are just that - virtual cores.
 
On the contrary, their benchmarks are actually very relevant to how most people will experience these games since only a handful of them will play games in 720p/low or on a TitanXP.
You have to use a high-end GPU and low resolutions in a CPU test to ensure that performance is not affected by the GPU in any way.
If the tests are GPU-limited or only look at average framerates, then they're not a measure of CPU performance.
You might not have a Titan X today, but that level of performance will probably be available for <$400 next year, and even less the year after that.
Most people seem to keep their CPUs far longer than they keep GPUs.

If you're the type of person who always runs games at ultra settings without any consideration for whether the GPU you're using can actually handle that properly, then CPU performance probably isn't that important to you.
But if you're the type of person that values consistently smooth gameplay and keeping minimum framerates as high as possible - or even just above 60 FPS - then CPU performance and memory bandwidth/latency are very important things.

I played through Deus Ex at 720p50 on a GTX 960 with a mixture of medium/low settings because that guaranteed me a smooth gaming experience for 95% of the game, instead of trying to run at 1080p60 on high settings and having it drop frames all the time.
If I had done the latter, CPU performance wouldn't have mattered at all because the game would be closer to 30 FPS than 60 most of the time.
But once I selected options which weren't causing the GPU to bottleneck things, the CPU became the limiting factor for performance.
So you certainly don't need a Titan X to have performance limited by the CPU.

I'm of the opinion that minimum framerates/0.1% frame times are critically important for gaming, and that averages really don't mean much for the overall gameplay experience.

Durante's WD2 CPU benchmark example shows that in a properly parallelized engine, a modern i5 is ahead of an old i7, so as I've said, there's only so much benefit HT can give you; virtual cores are just that - virtual cores.
Check the results again: the i7-2600K is ahead of the i5-6600K when you look at minimum framerates.
Just barely, but that's still a CPU which is 5 years older coming out on top due to hyperthreading support.
The i5-2500K is significantly behind it, when most benchmarks back when they were released suggested that performance was either the same, or actually in favor of the i5 in games - which is why I decided to put that $100 towards my video card instead and buy the i5.
I've kept that system for six years now and have been through two other video cards. Saving that $100 back then absolutely was not worth it.

It's a similar situation with RAM, where I went with DDR3-1600 due to the tight 8-8-8-24 timings vs higher clocked RAM with looser timings.
When you look at gaming performance now, faster clocked RAM makes far more difference than tighter memory timings.
 
I'm of the opinion that minimum framerates/0.1% frame times are critically important for gaming, and that averages really don't mean much for the overall gameplay experience.
I agree with this, which is why I think it's a shame that average FPS (and even minimum FPS) are still by far the most widespread metric.
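For reference, here is roughly how percentile lows are derived from a frame-time log, and why they're more robust than a single minimum-FPS number (the sample data is made up):

```python
# 1% / 0.1% lows: average FPS over the slowest X% of frames, so a single
# outlier can't dominate the way a lone minimum-FPS reading can.

def percent_low_fps(frame_times_ms: list[float], percent: float) -> float:
    worst = sorted(frame_times_ms, reverse=True)
    n = max(1, int(len(worst) * percent / 100))
    return 1000.0 / (sum(worst[:n]) / n)

log = [8.3] * 990 + [16.7] * 9 + [50.0]  # mostly ~120fps, a few hitches
print(f"1% low:   {percent_low_fps(log, 1.0):.1f} fps")
print(f"0.1% low: {percent_low_fps(log, 0.1):.1f} fps")
```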
 
One of the most popular PC games in Europe is the Football Manager series. I'm not sure if many of you are familiar with it, but it has the tag of being a bit of a spreadsheet simulator. Graphically, it really isn't much of anything at all. It relies heavily on CPU, so having an i7 over an i5 will bring big improvements. Meanwhile, a graphics card isn't even necessary.
 
Check the results again: the i7-2600K is ahead of the i5-6600K when you look at minimum framerates.

But that's just it: this minimum framerate result is completely irrelevant, as it's just the single worst second from the whole benchmark run. I always wonder why people don't understand this, especially when they are so quick to criticize average framerate, which at least shows the average performance across the whole benchmark. Minimum-framerate spikes can be caused by anything, really, and are not an indication of worse/better CPU performance - unless they show a consistent picture across several runs and several different CPU models.
 