
Anandtech reviews the new Intel i5 7600k & i7 7700k

Finally being able to stream 4k content from Amazon and Netflix is nice though. I don't think there are any other CPUs out there that allow you to do that.

But in no way is that worth upgrading from what I've got now.
 
Be aware that Anandtech's gaming benchmarks are absolutely worthless.
So few sites seem to know how to actually run a CPU-limited game benchmark now, rather than selecting games where the CPU doesn't matter, or having their test be GPU-limited.

CPU tests in games should be performed with the fastest GPU possible (1080 or Titan XP) at a low enough resolution that GPU load never comes anywhere close to 100%.
720p would be more appropriate than 1080p in many recent games.

If the test is looking at average framerates and not minimums, the results won't tell you anything useful.
Frametime graphs/percentiles are an even better metric.

A 2500K doesn't cut it any more if you're trying to play new games and your GPU is a GTX 960 or faster.
And new i5s (6600K) are losing to old i7s (2600K) now that many games are starting to make use of more than four threads.
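To illustrate the averages-vs-percentiles point, here's a quick sketch with made-up frame-time traces (not real benchmark data): two captures with nearly identical average FPS, where only the percentile reveals the stutter.

```python
# Two made-up frame-time traces (in ms) with nearly the same average FPS:
# one steady, one with stutter spikes that averages completely hide.
def percentile(frame_times, p):
    """Nearest-rank p-th percentile of a list of frame times."""
    ordered = sorted(frame_times)
    idx = max(0, round(p / 100 * len(ordered)) - 1)
    return ordered[idx]

smooth = [16.7] * 100                # steady ~60 FPS
stutter = [15.0] * 95 + [50.0] * 5   # mostly fast, with 50 ms spikes

for name, trace in (("smooth", smooth), ("stutter", stutter)):
    avg_fps = 1000 * len(trace) / sum(trace)
    p99 = percentile(trace, 99)
    print(f"{name}: avg {avg_fps:.0f} FPS, 99th-percentile frame time {p99:.1f} ms")
```

Both traces average out to roughly 60 FPS, but the 99th-percentile frame time jumps from 16.7 ms to 50.0 ms in the stuttering one, which is exactly the difference you feel while playing and exactly what an averages-only chart hides.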
 
Those gaming benchmarks are weird, seems like every single one of them is bottlenecked by the GPU. Why would you test CPUs on ultra settings?
 
Are you seriously trying to suggest that 4K on TVs is rare? It is so common and affordable that almost every new 40''+ TV is 4K.
Then why don't I (and not just me) have one, if it's so affordable? I guess there are some shitty 4K TVs available, but I'm not talking about those, you know. The same goes for 4K monitors; actually decent ones start at 27'' and above.

Shitty 4K monitor for poor people / decent 43'' 4K monitor for not-so-poor people / decent 27'' 4K monitor for not-so-poor people

Now, I can't buy any of these decent ones because I'd need to work about half a year to afford one, and that's without eating or drinking anything for that half a year. So yeah, very affordable indeed.
 
Be aware that Anandtech's gaming benchmarks are absolutely worthless.
So few sites seem to know how to actually run a CPU-limited game benchmark now, rather than selecting games where the CPU doesn't matter, or having their test be GPU-limited.

CPU tests in games should be performed with the fastest GPU possible (1080 or Titan XP) at a low enough resolution that GPU load never comes anywhere close to 100%.
720p would be more appropriate than 1080p in many recent games.

If the test is looking at average framerates and not minimums, the results won't tell you anything useful.
Frametime graphs/percentiles are an even better metric.

A 2500K doesn't cut it any more if you're trying to play new games and your GPU is a GTX 960 or faster.
And new i5s (6600K) are losing to old i7s (2600K) now that many games are starting to make use of more than four threads.

This. Very much this.

I went from a 2500K to an i7 6700K in 2016, and the improvements in the games I was playing (The Witcher 3, especially) were noticeable and appreciated, all with the same 980 Ti. A 2500K might be adequate for some things, but there are noticeable improvements to be made by upgrading CPU/mobo/RAM to Skylake or Kaby Lake at this point.
 
I'm still on Z97 with an i5 4460. I decided to cheap out as I never intended to overclock my CPU. I don't seem to be missing out on much between what I have and these latest processors. Maybe a couple of frames here or there. I'll just put my money into upgrading my GTX 970 when the time is right.
 
Those gaming benchmarks are weird, seems like every single one of them is bottlenecked by the GPU. Why would you test CPUs on ultra settings?

The whole article is a shit show. Anandtech has fallen since Anand left.

I stopped reading on page 1 when he stated that Mac and Linux didn't support the speed shifting... Two minutes of googling and I found a patch from an Intel employee for Linux back in 2014 that supported it (Broadwell was in development at the time; same tech), and macOS has also supported it for some time.

If you can't do your research you really shouldn't be writing reviews IMO.
 
Be aware that Anandtech's gaming benchmarks are absolutely worthless.
So few sites seem to know how to actually run a CPU-limited game benchmark now, rather than selecting games where the CPU doesn't matter, or having their test be GPU-limited.

CPU tests in games should be performed with the fastest GPU possible (1080 or Titan XP) at a low enough resolution that GPU load never comes anywhere close to 100%.
720p would be more appropriate than 1080p in many recent games.

If the test is looking at average framerates and not minimums, the results won't tell you anything useful.
Frametime graphs/percentiles are an even better metric.

A 2500K doesn't cut it any more if you're trying to play new games and your GPU is a GTX 960 or faster.
And new i5s (6600K) are losing to old i7s (2600K) now that many games are starting to make use of more than four threads.
Most everything I've played is just fine on an overclocked 3570k.

There's some extra performance to be gained for sure but I'm not hurting yet. I want to hit up a new cpu, ram, and mobo all at once.
 
CPU tests in games should be performed with the fastest GPU possible (1080 or Titan XP) at a low enough resolution that GPU load never comes anywhere close to 100%.
720p would be more appropriate than 1080p in many recent games.

I'd much rather have professional sites run these benchmarks at settings and resolutions that actually make sense and reflect real world usage. Who the fuck games at 720p? What I, and I think most people, are looking for in these benchmarks is how it affects performance in the real world. No one is buying a $400 i7 so that they can game at 720P.
 
Those gaming benchmarks are weird, seems like every single one of them is bottlenecked by the GPU. Why would you test CPUs on ultra settings?

Because it's Anandtech - their cpu and memory benchmarks in games have been useless for several years already.

But AMD fans like them for obvious reasons :D
 
I'd much rather have professional sites run these benchmarks at settings and resolutions that actually make sense and reflect real world usage. Who the fuck games at 720p? What I, and I think most people, are looking for in these benchmarks is how it affects performance in the real world. No one is buying a $400 i7 so that they can game at 720P.
Congratulations, you also don't understand how CPU benchmarking in games should be performed.
Resolution doesn't mean much to the CPU, it's framerate which hits the CPU hard.
If you are performing a CPU test, you have to eliminate the effect of the GPU.
That means running games at 720p or even lower resolutions.

This way you can see how many frames a certain CPU can push.
You might see that a 7700K has a minimum framerate of 80 FPS while a 2500K has a minimum framerate of 50 FPS.

Then you can look at GPU benchmarks and see that your current GPU can only run the same test at 40 FPS - so you would be GPU limited and it would not matter which of the two CPUs you had.
You might check benchmarks for a faster card which shows that it can reach 70 FPS in the same test, so now you can see that you would need to pair it with a faster CPU than a 2500K, since it limits the game to 50 FPS.

Without eliminating the GPU from the equation, the test only shows how that specific PC performs, and not how that CPU performs regardless of the GPU it's paired with - so the results aren't useful to anyone.

Here's an example from Deus Ex: Mankind Divided:
It doesn't matter how fast a GPU you put in that system or how much you reduce the resolution, you're not getting more than 43 FPS from that CPU.
You can increase the resolution even higher and drop below 43 FPS, but nothing other than upgrading the CPU will increase the framerate.
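The whole argument can be sketched in a few lines: the framerate you actually get is capped by whichever component is slower. The numbers below are the hypothetical ones from this post, not measurements.

```python
# Delivered framerate is the minimum of what the CPU and GPU can each push.
def delivered_fps(cpu_fps, gpu_fps):
    return min(cpu_fps, gpu_fps)

cpu_minimum = {"2500K": 50, "7700K": 80}             # from a 720p CPU-limited test
gpu_minimum = {"current GPU": 40, "faster GPU": 70}  # from GPU benchmarks

for gpu_name, gpu_fps in gpu_minimum.items():
    for cpu_name, cpu_fps in cpu_minimum.items():
        print(f"{cpu_name} + {gpu_name}: {delivered_fps(cpu_fps, gpu_fps)} FPS")
```

With the current GPU, both CPUs land at 40 FPS (GPU-limited, so the CPU choice is irrelevant); with the faster GPU, the 2500K caps the game at 50 FPS while the 7700K lets it reach 70. That's why the CPU-limited 720p numbers are the ones worth publishing.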

If the average FPS is that close, I doubt that the minimum FPS would have a significant disparity. But yes, it'd be nice to see more detailed benchmarks. I'm not holding my breath on any major improvements.
[Image: cpu_030lkr3.png]


http://www.techspot.com/review/1263-gears-of-war-4-benchmarks/page4.html

Look at the difference between averages and minimums on the i5s and i7s in this test.
Averages are not that far apart, but there's a huge gap when you look at minimums.

The whole article is a shit show. Anandtech has fallen since Anand left.

I stopped reading on page 1 when he stated that Mac and Linux didn't support the speed shifting... Two minutes of googling and I found a patch from an Intel employee for Linux back in 2014 that supported it (Broadwell was in development at the time; same tech), and macOS has also supported it for some time.

If you can't do your research you really shouldn't be writing reviews IMO.
You must be confusing Speed Shift with something else. It was introduced with Skylake (2015) so I doubt there was a Linux patch for it in 2014.
 
I'd much rather have professional sites run these benchmarks at settings and resolutions that actually make sense and reflect real world usage. Who the fuck games at 720p? What I, and I think most people, are looking for in these benchmarks is how it affects performance in the real world. No one is buying a $400 i7 so that they can game at 720P.

But it doesn't make sense to test them like that if you want to know how they actually compare against each other when they do get tasked with something where the CPU is the bottleneck.

Just looked at some of the benchmarks they did, and one of them showed an i3 at the top. If they tested BF1 in 64-player Conquest, that i3 would get massively lower FPS than the 7700K, even at normal play settings.

The problem is that it's hard to get consistent benchmark runs in multiplayer, so everyone just tests singleplayer, where the CPU load is nowhere near what it is in MP.

These GPU-limited benches won't tell you shit about how the CPU will hold up when something CPU-intensive like BF1 comes along.
 
I still don't see any reason to upgrade my 3770k.
It's probably about a 40% performance improvement, and potentially a decent OC with the latest chip (it should OC better than Skylake). I'm on the same CPU and it's getting to the point where it's almost worth it.

Personally, though, I want more than 4 cores at the 7700K/6700K clocks. The real issue is that upgrading the CPU means buying a new motherboard and RAM, and potentially a new cooler.
 
You must be confusing Speed Shift with something else. It was introduced with Skylake (2015) so I doubt there was a Linux patch for it in 2014.

https://patchwork.kernel.org/patch/5246361/

You might be right about Broadwell not supporting it, but Speed Shift (aka hardware P-states in non-marketing speak) has definitely been in the kernel since 2014. That's in git, of course; the first stable version may be a few months out from that.

Also consider that Intel usually has new features land in Linux months before shipping hardware. They like to have a lead in the kernel since many distributions use older kernels for stability.
 
https://patchwork.kernel.org/patch/5246361/

You might be right about Broadwell not supporting it, but Speed Shift (aka hardware P-states in non-marketing speak) has definitely been in the kernel since 2014. That's in git, of course; the first stable version may be a few months out from that.

Also consider that Intel usually has new features land in Linux months before shipping hardware. They like to have a lead in the kernel since many distributions use older kernels for stability.
Assuming this is the same thing - and it certainly looks like it - I stand corrected.
 
Are these CPUs capable of being OCed similarly to the i5 2500K? Mine is OCed to 4.6GHz and I play at 1080p. Wondering if I should upgrade; I get 60 FPS in every game I play with my current setup, 16GB of RAM and GTX 980s in SLI.
 
Are these CPUs capable of being OCed similarly to the i5 2500K? Mine is OCed to 4.6GHz and I play at 1080p. Wondering if I should upgrade; I get 60 FPS in every game I play with my current setup, 16GB of RAM and GTX 980s in SLI.

I've seen 2 reviewers get their chips up to 5GHz. Might not be indicative of all chips, but it seems very possible to reach very high clocks.
 
Intel needs competition. They have gotten completely complacent for anything but power consumption. Please AMD, let your next processor challenge on the high end for once to shake Intel out of complacency.
 
How many people have 4K monitors or TVs?

Then why don't I (and not just me) have one, if it's so affordable? I guess there are some shitty 4K TVs available, but I'm not talking about those, you know. The same goes for 4K monitors; actually decent ones start at 27'' and above.

Shitty 4K monitor for poor people / decent 43'' 4K monitor for not-so-poor people / decent 27'' 4K monitor for not-so-poor people

Now, I can't buy any of these decent ones because I'd need to work about half a year to afford one, and that's without eating or drinking anything for that half a year. So yeah, very affordable indeed.

Why are you coming into a thread about top-of-the-line tech and getting mad at people who are discussing the advancements of said tech? If you can't afford it, then I'm sorry. But you shouldn't get all twisted up just because a CPU is capable of streaming 4K from services besides YouTube.
 
I have a three year old i5 4570k and I've been wondering if I should replace it or not. I definitely get some stutter in open world games and I'm pretty sure the CPU is the reason for that. But also it feels like the improvements in Intel CPUs in the past few years have been so minor that I shouldn't even bother.

Hopefully that new AMD CPU lives up to the hype.
 
I have a three year old i5 4570k and I've been wondering if I should replace it or not. I definitely get some stutter in open world games and I'm pretty sure the CPU is the reason for that. But also it feels like the improvements in Intel CPUs in the past few years have been so minor that I shouldn't even bother.

Hopefully that new AMD CPU lives up to the hype.

If Zen is what AMD has promised, it'll probably be better than a 4570K.

The biggest advantage to upgrading these days is new platform features like PCIe 3.0, USB 3.1, NVMe, etc.
 
I think it's finally time to retire the 2500k.

The minimum framerates are killing me.

Is your CPU overclocked? You can overclock it to 4.5 or 4.6 easily with an aftermarket cooler. Do you have DDR3 RAM? Are you sure it's not your GPU? What resolution do you play at?

Mine is OCed to 4.6GHz and paired with GTX 980s in SLI, and at 1080p my minimum framerates almost never dip below 60 FPS in any game at max settings.
 
I'm only now just starting to feel the age of my CPU (3570k OC to 4.7ghz). My GTX 1080 is definitely getting bottlenecked by it in some of the more recent titles like BF1.

Maybe this year will finally be the year where I upgrade my CPU, Mobo & Memory?
 
These benches seem severely GPU limited.

They are not. Go look on YouTube at Paul's Hardware, among other well-known tech YouTubers, and you'll see how minimal, if it exists at all, the Kaby Lake performance increase is.

It's kind of garbage, actually.

I hope Ryzen at least puts heat on Intel to start dropping prices instead of price gouging like Nvidia has been doing for some time now.
 
Same here, I keep saying, maybe next year will be the one. It never is.

Maybe next year though.

After a few years of this, though, the total performance gap between what you have and what's available will be dramatic, so isn't this just good for us as consumers?
 
Why are you coming into a thread about top-of-the-line tech and getting mad at people who are discussing the advancements of said tech? If you can't afford it, then I'm sorry. But you shouldn't get all twisted up just because a CPU is capable of streaming 4K from services besides YouTube.
I'm not mad, it's just... wrong to say such things when the decent stuff is not really all that affordable for everyone. Now, I have a high-end rig and I can afford high-end stuff, but I'm not going to buy shitty hardware, and the decent stuff is currently too expensive. And if I'm going to buy shitty hardware, I might as well sell my high-end rig and use a potato PC instead.

And again, I'm not the only one who can't afford this decent 4K stuff. When it becomes affordable, everyone will have a decent 4K TV and 4K monitor... just not right now. 4K is only getting there, just like Blu-ray once was; now those are common and actually affordable for everyone to have at home.

Oh, and the decent stuff is just the tip of the iceberg, and really not all that expensive compared to 150K RUB monitors or TVs. So yeah, the decent stuff isn't even the best you can buy.

Nowadays you can get $500 55'' 4K TVs. Cheap ones, sure, but still. The average Joe can buy one without losing an arm.
Doesn't this average Joe have better things to do with his life?

I mean, I guess he could take out credit and then work 24/7 to pay it off for god knows how long... But hey, at least he'd have a 55'' 4K TV in his house. Cool, right? Well, no. $500 is a lot of money that could be spent on something else, something a lot more important. I'm not telling people what to do, but this is not something the average Joe can buy. I'm sorry, it's not.
 
I mean, I guess he could take out credit and then work 24/7 to pay it off for god knows how long... But hey, at least he'd have a 55'' 4K TV in his house. Cool, right? Well, no. $500 is a lot of money that could be spent on something else, something a lot more important. I'm not telling people what to do, but this is not something the average Joe can buy. I'm sorry, it's not.

$500 for a TV is definitely something an "average Joe" can afford to spend if he saves up for a while.

The average Joe is not someone who can barely make it through the month; if that's the case, you're below the average, and then yes, a new TV should not be a priority.

But 500 bucks for a 4K TV is definitely very affordable for most people.
 
Maybe a good time to upgrade my i7 920.. Or maybe not

I haven't bothered due to 1080P/60 still being achievable on this setup (R9 290X 4GB / i7 920 oc 3.6Ghz HT enabled / 16GB / Win 10 / Samsung 850 EVO 500GB SSD / 2TB Toshiba HDD)

The CPU's overclock was essential though; below 3.4GHz it causes real-time bottlenecking like I've never seen before. It started after upgrading from a GTX 670 to the R9 290X (random massive stuttering and frame drops, which went away completely after pushing the CPU beyond its previous 3.2GHz overclock).

Played through Fallout 4 on my X1 and recently started playing it on PC. It's amazing how much better it runs on my PC, to be honest. The load times alone are insanely faster, plus 60fps vs. the X1's 30fps (with dips below 30).
 
I'm not mad, it's just... wrong to say such things when the decent stuff is not really all that affordable for everyone. Now, I have a high-end rig and I can afford high-end stuff, but I'm not going to buy shitty hardware, and the decent stuff is currently too expensive. And if I'm going to buy shitty hardware, I might as well sell my high-end rig and use a potato PC instead.

And again, I'm not the only one who can't afford this decent 4K stuff. When it becomes affordable, everyone will have a decent 4K TV and 4K monitor... just not right now. 4K is only getting there, just like Blu-ray once was; now those are common and actually affordable for everyone to have at home.

Oh, and the decent stuff is just the tip of the iceberg, and really not all that expensive compared to 150K RUB monitors or TVs. So yeah, the decent stuff isn't even the best you can buy.

Doesn't this average Joe have better things to do with his life?

I mean, I guess he could take out credit and then work 24/7 to pay it off for god knows how long... But hey, at least he'd have a 55'' 4K TV in his house. Cool, right? Well, no. $500 is a lot of money that could be spent on something else, something a lot more important. I'm not telling people what to do, but this is not something the average Joe can buy. I'm sorry, it's not.

Bringing up your income situation is totally off topic. You're barging into a tech thread and trying to dismiss an advancement made to a CPU because people at certain income levels don't own a TV or monitor that supports 4K. It's good that you have your priorities straight, but does that really need to be stated here? If $500 is too much for you, then why are you in a thread about high-end enthusiast CPUs?

I used to make very little in my younger years too, but seeing new hardware just made me want to work harder... not be snide toward those who can afford and enjoy stuff like this.
 