
Skylake review thread

Without taking into account money,

would it be worth my time (literally time) upgrading from a stock i7-3930k to the i7-6700k? Going from 6 core to 4 core?
 
Without taking into account money,

would it be worth my time upgrading from a stock i7-3930k to the i7-6700k? Going from 6 core to 4 core?

Personally, I don't think so. Even if you have money to burn. They're just so close that the 6700k would pretty much be a sidegrade: a bit better in games that are single-core heavy, and a bit worse in things like video editing.

Unless you really want the board features, the upgrade seems like more of a hassle than it's worth.

I'm upgrading to the 6700k once they actually launch in North America, but I'm coming from a seven-year-old 920 that doesn't even support SATA-III. In large part for improved performance in Dolphin. Got to get the more consistent 1080p 60fps 16:9 Super Mario Sunshine. :)
 

mkenyon

Banned
Without taking into account money,

would it be worth my time (literally time) upgrading from a stock i7-3930k to the i7-6700k? Going from 6 core to 4 core?
Nope.
They're using a Titan X card in 1080p with no MSAA. How would that be GPU limited?
Your frame of reference is that "GPU limited" implies a comparison, that more performance is possible with a better GPU.

However, it just means that the processor is pushing out frames with room to spare, and that the overall cap on the framerate comes from the GPU.
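A rough way to picture that (a toy sketch with made-up numbers, not taken from any review): whichever side would deliver fewer frames per second on its own sets the cap.

```python
# Toy model: each frame needs CPU work and GPU work, so the delivered
# framerate is roughly capped by whichever side is slower on its own.
def delivered_fps(cpu_limit_fps, gpu_limit_fps):
    """Each argument is the fps that component could sustain by itself."""
    return min(cpu_limit_fps, gpu_limit_fps)

# Hypothetical numbers, purely for illustration:
print(delivered_fps(cpu_limit_fps=150, gpu_limit_fps=90))   # 90  -> GPU limited
print(delivered_fps(cpu_limit_fps=150, gpu_limit_fps=200))  # 150 -> CPU limited
```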
 

Oxn

Member
Without taking into account money,

would it be worth my time (literally time) upgrading from a stock i7-3930k to the i7-6700k? Going from 6 core to 4 core?

I will never understand why people want to "upgrade" for the sake of upgrading.

It doesn't even sound like you have any reason to consider this at ALL.
 
I will never understand why people want to "upgrade" for the sake of upgrading.

It doesn't even sound like you have any reason to consider this at ALL.

I might end up getting this stuff for free, which is why I'm asking.

Also, the USB ports on my current Sabertooth X79 are shit.
 

mkenyon

Banned
I might end up getting this stuff for free, which is why I'm asking.

Also, the USB ports on my current Sabertooth X79 are shit.
Different situation then. I mean, if you're having some random issues with your current setup, and it's free, then why not?
I will never understand why people want to "upgrade" for the sake of upgrading.

It doesn't even sound like you have any reason to consider this at ALL.
Because some people enjoy the hobby of PC Building, tinkering with new tech, and min/maxing to see different numbers on benchmarks. You don't have to enjoy it, but all that stuff can be more compelling than playing games for some folks.
 
Different situation then. I mean, if you're having some random issues with your current setup, and it's free, then why not?

Because some people enjoy the hobby of PC Building, tinkering with new tech, and min/maxing to see different numbers on benchmarks. You don't have to enjoy it, but all that stuff can be more compelling than playing games for some folks.

The issues aren't the be-all and end-all, but they are huge headaches. I've tried everything from powered USB hubs and whatnot; the USB system just blows on that board. What is the point of having so many ports if you can't use half of them at once w/o speed drops?
 

Tovarisc

Member
That IPC increase is mediocre. I hope this release drives down prices on the 4690k.

If Skylake is a "bad/mediocre" CPU release, then why would the price of the "stronger" platform go down instead of up? If Skylake is somehow deemed a failure by some metric, the price of the 4690K will go up, not down.

Edit: Also Intel doesn't discount their old models, they just stop support and production
 
Edit: Also Intel doesn't discount their old models, they just stop support and production
[image]
 

Shredderi

Member
That's it? I'm the one who is ignorant, so it's my fault for expecting too much or not knowing what to expect, but I was expecting a huge "next-gen" level improvement because of some things I read early this year. Thought it was gonna be a huge leap for some reason. Not that I was gonna be upgrading for a few years anyway.
 
Edit: Also Intel doesn't discount their old models, they just ~~stop support and production~~ use the old fabs for different things

They actually use their older production facilities for less expensive chips and other purposes. The PCH is usually a process node behind the CPU, Intel has foundry customers that will use the capacity, the eDRAM on the CPUs is manufactured on the 22nm process, flash memory is on older processes, and dirt-cheap Atom processors are on older processes.
 

Tovarisc

Member
That's it? I'm the one who is ignorant, so it's my fault for expecting too much or not knowing what to expect, but I was expecting a huge "next-gen" level improvement because of some things I read early this year. Thought it was gonna be a huge leap for some reason. Not that I was gonna be upgrading for a few years anyway.

Shame that the OP didn't add more benches to the OP and kept what he cherry-picked from the start.

Here are some gaming numbers for when a game gets CPU limited: http://www.neogaf.com/forum/showpost.php?p=174456843&postcount=178

DigitalFoundry videos on said numbers: http://www.neogaf.com/forum/showpost.php?p=174626037&postcount=197

More benches to look over: http://www.neogaf.com/forum/showpost.php?p=174281616&postcount=54

True that Skylake doesn't blow minds in most benches, but it's still a solid tock from Intel, with the expected performance increases to go with the chipset update. Also very nice performance increases in gaming situations where the CPU has to step up.
 
Shame that the OP didn't add more benches to the OP and kept what he cherry-picked from the start.

Here are some gaming numbers for when a game gets CPU limited: http://www.neogaf.com/forum/showpost.php?p=174456843&postcount=178

DigitalFoundry videos on said numbers: http://www.neogaf.com/forum/showpost.php?p=174626037&postcount=197

More benches to look over: http://www.neogaf.com/forum/showpost.php?p=174281616&postcount=54

True that Skylake doesn't blow minds in most benches, but it's still a solid tock from Intel, with the expected performance increases to go with the chipset update. Also very nice performance increases in gaming situations where the CPU has to step up.

Them Witcher 3 numbers don't seem right. How was the testing done? Was this a case of them using old benchmark results for the Haswell chip? Since the 6600K is newer, it would be benchmarked with a patched version of W3 and updated GPU drivers as well.
 
I'm upgrading for the chipset as much as anything. I'm currently on the P67 chipset. It definitely feels its age. The review that PCPer did with GTA V also convinced me. GTA V on my 2600k is on the edge. I feel like combining my 980 Ti with an overclocked 6700k will make things a little smoother when you include things like frame pacing. Not dramatically, but noticeable for a picky person such as myself.
 

Tovarisc

Member
Them Witcher 3 numbers don't seem right. How was the testing done? Was this a case of them using old benchmark results for the Haswell chip? Since the 6600K is newer, it would be benchmarked with a patched version of W3 and updated GPU drivers as well.

Videos showing real-time graphs for all CPUs suggest that they ran all benches with the latest version of the game, using settings that minimize the GPU bottleneck/effect on CPU performance. Why?
 

Evo X

Member
More proof that Skylake scales hugely with faster memory: 17.3% better performance in Watch Dogs when comparing 2133MHz RAM to 3333MHz.

[Image: Watch Dogs DDR4 memory scaling benchmark]
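For reference, that percentage is just the relative uplift between the two averages; a quick sketch with placeholder numbers rather than the chart's actual values:

```python
# Relative uplift between two average framerates, e.g. DDR4-3333 vs DDR4-2133.
def uplift_pct(fast_fps, slow_fps):
    return (fast_fps / slow_fps - 1.0) * 100.0

# Placeholder values for illustration only (not read off the chart):
print(f"{uplift_pct(81.0, 69.0):.1f}%")  # ~17.4%
```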
 

Hanzou

Member
A question about CPUs and the new DirectX 12. In that last thread people were saying that with DirectX 12, games would no longer be CPU bound; is that true? Does DirectX 12 mean that there is almost no gain in gaming from upgrading your CPU as long as it is fairly recent? I am running an older i5 760 with a smallish overclock and have been thinking of upgrading for the last year, but if what I hear is true, a GPU upgrade would be a much better decision.
 

tokkun

Member
Shame that the OP didn't add more benches to the OP and kept what he cherry-picked from the start.

Here are some gaming numbers for when a game gets CPU limited: http://www.neogaf.com/forum/showpost.php?p=174456843&postcount=178

That is a pretty ridiculously contrived comparison. By the time Battlefield becomes CPU-limited, it is already running at > 120 fps. At that point, are you going to benefit from a 10% increase?

I think that most people who are going to shell out for a Titan X would rather play at high resolutions, and go back to being GPU-limited, than play Shadow of Mordor at 130 FPS or CoD at 200 FPS. Only one of the games on there is getting less than 70 FPS even with the 2500K at stock clocks! If they had actually overclocked them (which is pretty much the whole point of buying a K SKU), it would have been even more of a joke.

Reminds me of the good ol' days when tech sites would include Quake running at 480p in their CPU benchmarks and would act all excited about how the new CPU increased their framerate from 250 to 270.
 
That is a pretty ridiculously contrived comparison. By the time Battlefield becomes CPU-limited, it is already running at > 120 fps. At that point, are you going to benefit from a 10% increase?

I think that most people who are going to shell out for a Titan X would rather play at high resolutions, and go back to being GPU-limited, than play Shadow of Mordor at 130 FPS or CoD at 200 FPS. Only one of the games on there is getting less than 70 FPS even with the 2500K at stock clocks! If they had actually overclocked them (which is pretty much the whole point of buying a K SKU), it would have been even more of a joke.

Reminds me of the good ol' days when tech sites would include Quake running at 480p in their CPU benchmarks and would act all excited about how the new CPU increased their framerate from 250 to 270.

How else would you like to compare CPUs? The reason they put them on low settings like that is so that the games are CPU instead of GPU limited, because otherwise you don't see differences between the framerates.

Also, it can still be very relevant for people that game at 120 FPS or for games that are way more CPU limited. It doesn't matter when a CPU runs a game at 240 instead of 200 FPS, but you have an indication of how much better it performs and get an idea of how much better it would be in whatever your use case for the CPU is.
 

Tovarisc

Member
That is a pretty ridiculously contrived comparison. By the time Battlefield becomes CPU-limited, it is already running at > 120 fps. At that point, are you going to benefit from a 10% increase?

I think that most people who are going to shell out for a Titan X would rather play at high resolutions, and go back to being GPU-limited, than play Shadow of Mordor at 130 FPS or CoD at 200 FPS. Only one of the games on there is getting less than 70 FPS even with the 2500K at stock clocks! If they had actually overclocked them (which is pretty much the whole point of buying a K SKU), it would have been even more of a joke.

Reminds me of the good ol' days when tech sites would include Quake running at 480p in their CPU benchmarks and would act all excited about how the new CPU increased their framerate from 250 to 270.

Well, of course they would be excited about such an FPS increase in Quake when running into CPU-bound situations, because it shows that CPU tech has improved. That's the point of all of these benches: to test whether CPU tech has actually advanced and by how much. We aren't here to bench the Titan X.

First people complain and moan about how Skylake didn't give a 15+% FPS increase in GPU-limited normal gaming situations, but when it's shown that Skylake actually gives a noticeable performance increase in CPU-bound situations, even that isn't good enough? Maybe I should dig up some benches for previous Intel generations and see if there have been noticeable FPS gains in GPU-limited gaming from upgrading the CPU.

This isn't specifically aimed at you, Tokkun; I just find it weird how people on different forums keep moving the goalposts for what Skylake should be doing to be "acceptable"/a "success".
 

tokkun

Member
How else would you like to compare CPUs? The reason they put them on low settings like that is so that the games are CPU instead of GPU limited, because otherwise you don't see differences between the framerates.

Also, it can still be very relevant for people that game at 120 FPS or for games that are way more CPU limited. It doesn't matter when a CPU runs a game at 240 instead of 200 FPS, but you have an indication of how much better it performs and get an idea of how much better it would be in whatever your use case for the CPU is.

I want them to benchmark CPUs in scenarios that are realistic for the people who are thinking about purchasing them. If that scenario happens to show no appreciable difference, that is a good thing for people to know. Telling the reader that a product is not worth buying is arguably the most important information you can give them. Unfortunately most people reading that review are hoping that the new tech will be awesome, so there is this perverse incentive for review sites to try to demonstrate the biggest differences they possibly can.

Rather than caring about informing the reader's decision about whether or not to purchase in a meaningful way, these sites get it in their head that they need to come up with a scenario that shows a difference between the products they are reviewing because otherwise the readers will not bother clicking in the future. Who wants to read a review of a CPU when it just says that it performs nearly identically to every CPU from the last 5 years? This is also why you see more and more video encoding benchmarks showing up in reviews, because they are one of the few applications that will keep showing noticeable performance improvements year-over-year.

Well, of course they would be excited about such an FPS increase in Quake when running into CPU-bound situations, because it shows that CPU tech has improved. That's the point of all of these benches: to test whether CPU tech has actually advanced and by how much. We aren't here to bench the Titan X.

Do you ever wonder why sites don't benchmark how these CPUs perform when running on Mars? That higher temperature would probably skew the results, don't you think?

I'm not trying to be flippant; I'm just trying to demonstrate the point that just because we can come up with a benchmarking scenario that shows a difference doesn't mean that it is a useful one to focus on, unless of course you are writing for marsgamer.com.

To me, CPU tech is really only improving when it is getting better at doing the things its audience actually does already or would like to do. I do acknowledge there is some grey area here, where it may be useful to try to predict how a product might perform in some future scenario that is not here yet, but personally I just don't see those benchmarks you posted as providing any useful insight in that way. If anything, it seems like the trend might be toward less reliance on the CPU if DX12 actually provides any of the benefit that is claimed.

First people complain and moan about how Skylake didn't give a 15+% FPS increase in GPU-limited normal gaming situations, but when it's shown that Skylake actually gives a noticeable performance increase in CPU-bound situations, even that isn't good enough? Maybe I should dig up some benches for previous Intel generations and see if there have been noticeable FPS gains in GPU-limited gaming from upgrading the CPU.

The problem there is that people are not well-informed on the fact that games are mostly GPU-bound. The appropriate solution is not to trick them into thinking that games are usually CPU-bound by coming up with some contrived scenario that does not reflect the real world.

Honestly, how big is the audience of people who own a monitor that does > 120 Hz, want to play at > 120 fps in 1080p rather than going to a higher resolution, and also own/want a K processor but don't want to overclock it? Less than 1%?

It might not have been so egregious if Eurogamer had spent more time showing the more realistic scenarios where the game is GPU bound before saying "by the way, if you are in this extremely niche audience, then it would help you". Yet it was the only comparison they did in the entire review. If that was the only review you read, you would come out completely misinformed.
 
This is also why you see more and more video encoding benchmarks showing up in reviews, because they are one of the few applications that will keep showing noticeable performance improvements year-over-year.

Plenty of people do video encoding. Many of them don't even realize it. If you're streaming your gameplay, you're doing video encoding. In real time while playing a game. And hammering your CPU unless you're using ShadowPlay.

All I can say about the rest of your post is that by your logic, we should all still be using Nehalem or those amazing AMD FX CPUs. The reality is that these benchmarks with the game running at 640x480 or something are there just to show the raw CPU IPC improvements, but there's plenty of benchmarks with a sane resolution which demonstrate that frame times are much improved starting with Sandy Bridge and then Ivy Bridge, Haswell, and now Skylake. Past a certain point, yeah, you spend 0 frames beyond 16ms and you're locked at 60 fps. There are games out there where you don't meet that particular performance requirement until beyond Sandy Bridge.

[Image: Far Cry 4 frame-time chart (frames beyond 16ms)]
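For anyone unfamiliar with charts like that, the "frames beyond 16ms" idea is easy to compute yourself from a per-frame time log; a minimal sketch with made-up frame times:

```python
# Count frames that blow the 60 fps budget (16.7 ms per frame).
# In practice frame_times_ms would come from a capture tool's per-frame log;
# these values are made up for illustration.
frame_times_ms = [12.1, 15.8, 16.2, 24.9, 14.0, 33.4, 16.6, 15.1]

BUDGET_MS = 1000.0 / 60.0  # ~16.7 ms per frame for a locked 60 fps

over_budget = [t for t in frame_times_ms if t > BUDGET_MS]
print(f"{len(over_budget)} of {len(frame_times_ms)} frames missed the 16.7 ms budget")
print(f"worst frame: {max(frame_times_ms):.1f} ms")
```

A CPU that never misses that budget in a given game is fast enough for a locked 60 fps there, which is the point being made above.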
 
If I have absolutely no PC right now and was planning to get one around the holidays, should I grab the i7 or try and get the last gen for a fresh build?

I'm putting a 980 Ti into it
 
If I have absolutely no PC right now and was planning to get one around the holidays, should I grab the i7 or try and get the last gen for a fresh build?

I'm putting a 980 Ti into it

Only get previous generation hardware if you can find someone selling it second hand for cheap. Intel doesn't lower prices for older inventory.
 

Tovarisc

Member
Do you ever wonder why sites don't benchmark how these CPUs perform when running on Mars? That higher temperature would probably skew the results, don't you think?

I'm not trying to be flippant; I'm just trying to demonstrate the point that just because we can come up with a benchmarking scenario that shows a difference doesn't mean that it is a useful one to focus on, unless of course you are writing for marsgamer.com.

To me, CPU tech is really only improving when it is getting better at doing the things its audience actually does already or would like to do. I do acknowledge there is some grey area here, where it may be useful to try to predict how a product might perform in some future scenario that is not here yet, but personally I just don't see those benchmarks you posted as providing any useful insight in that way. If anything, it seems like the trend might be toward less reliance on the CPU if DX12 actually provides any of the benefit that is claimed.

The problem there is that people are not well-informed on the fact that games are mostly GPU-bound. The appropriate solution is not to trick them into thinking that games are usually CPU-bound by coming up with some contrived scenario that does not reflect the real world.

Honestly, how big is the audience of people who own a monitor that does > 120 Hz, want to play at > 120 fps in 1080p rather than going to a higher resolution, and also own/want a K processor but don't want to overclock it? Less than 1%?

It might not have been so egregious if Eurogamer had spent more time showing the more realistic scenarios where the game is GPU bound before saying "by the way, if you are in this extremely niche audience, then it would help you". Yet it was the only comparison they did in the entire review. If that was the only review you read, you would come out completely misinformed.

Higher temps on Mars and marsgaming.com? wut?

Also, Eurogamer underlines that they are running settings that force a CPU limitation so they can bench how Skylake performs when performance is CPU limited. In the videos they even say that you very likely won't run such settings in real (read: GPU-limited) gaming, but that these benches are about the CPU.

How is that tricking people and misleading them? They tell people what they are doing with these benches and how they do it. If a person reading and/or watching the bench videos doesn't understand what they're being told, then it just isn't Eurogamer's fault at that point.

The same kind of misinformed state could be reached by reading a bench where Skylake is run coupled with the worst possible DDR4 out there.

There are many, well, most benches out there that run gaming tests with settings that are GPU limited. These show Skylake to be a few percent faster in gaming in general; in some games better, in some worse, in some the same as prev-gen. Some of these benches also show that Skylake gets a bigger performance increase and lead over prev-gen when using faster (2666+ MHz) DDR4.

Maybe I should have dug up that stuff on top of Eurogamer, even though I gave a link to a post filled with bench links, but I singled out EG because it shows off so well how Skylake performs when CPU power is needed by a game.
 
I want them to benchmark CPUs in scenarios that are realistic for the people who are thinking about purchasing them. If that scenario happens to show no appreciable difference, that is a good thing for people to know. Telling the reader that a product is not worth buying is arguably the most important information you can give them. Unfortunately most people reading that review are hoping that the new tech will be awesome, so there is this perverse incentive for review sites to try to demonstrate the biggest differences they possibly can.

Rather than caring about informing the reader's decision about whether or not to purchase in a meaningful way, these sites get it in their head that they need to come up with a scenario that shows a difference between the products they are reviewing because otherwise the readers will not bother clicking in the future. Who wants to read a review of a CPU when it just says that it performs nearly identically to every CPU from the last 5 years? This is also why you see more and more video encoding benchmarks showing up in reviews, because they are one of the few applications that will keep showing noticeable performance improvements year-over-year.



Do you ever wonder why sites don't benchmark how these CPUs perform when running on Mars? That higher temperature would probably skew the results, don't you think?

I'm not trying to be flippant; I'm just trying to demonstrate the point that just because we can come up with a benchmarking scenario that shows a difference doesn't mean that it is a useful one to focus on, unless of course you are writing for marsgamer.com.

To me, CPU tech is really only improving when it is getting better at doing the things its audience actually does already or would like to do. I do acknowledge there is some grey area here, where it may be useful to try to predict how a product might perform in some future scenario that is not here yet, but personally I just don't see those benchmarks you posted as providing any useful insight in that way. If anything, it seems like the trend might be toward less reliance on the CPU if DX12 actually provides any of the benefit that is claimed.



The problem there is that people are not well-informed on the fact that games are mostly GPU-bound. The appropriate solution is not to trick them into thinking that games are usually CPU-bound by coming up with some contrived scenario that does not reflect the real world.

Honestly, how big is the audience of people who own a monitor that does > 120 Hz, want to play at > 120 fps in 1080p rather than going to a higher resolution, and also own/want a K processor but don't want to overclock it? Less than 1%?

It might not have been so egregious if Eurogamer had spent more time showing the more realistic scenarios where the game is GPU bound before saying "by the way, if you are in this extremely niche audience, then it would help you". Yet it was the only comparison they did in the entire review. If that was the only review you read, you would come out completely misinformed.

There are still plenty of games where you run up against CPU bottlenecks when you try to go for 120 FPS, and it seemed relevant for all but one or two of the games that were benchmarked.

With these numbers you can determine your bottlenecks. You can check whether your GPU gets higher framerates than those. When it does, you might get CPU bottlenecked at some point. If everything just showed the same framerate, you couldn't determine that.

You are suggesting something completely useless. By showing your normal average game on normal settings, there would be little to no difference between all of the CPUs. So, okay, now I know it makes no difference if I do that. I also have no other information about how the CPU compares to other ones and how good of a product it is. And then at some point my CPU does get too weak to run games. The current numbers give the same information.

You could just as easily put a line in every CPU review saying: "If the framerate of the CPU is above 60 in a benchmark, it likely does not matter for you."

Instead of having numbers that everybody can use and apply to their own situation, you want to make them relevant only to a subset of the users. You also lose a huge aspect of reviews, which is comparing them with the other offerings.

When you look at CPU reviews and are using benchmarks, it is your responsibility to learn when a CPU purchase is worthwhile to you, instead of every review needing to dumb itself down because its readers can't inform themselves. This is applicable to every product.
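To make that "apply the numbers to your own situation" step concrete, here is a hedged sketch of the check with hypothetical numbers: compare a CPU-limited result against your own target refresh rate, assuming your GPU isn't the cap.

```python
# Given a CPU-limited benchmark result for a game, check whether that CPU
# should hold your target framerate (assuming the GPU isn't the bottleneck).
def cpu_is_enough(cpu_limited_fps, target_fps, headroom=1.1):
    """headroom leaves ~10% margin for heavier scenes, background tasks, etc."""
    return cpu_limited_fps >= target_fps * headroom

# Hypothetical: a chip that manages 130 fps in a CPU-bound test, vs two targets.
print(cpu_is_enough(cpu_limited_fps=130, target_fps=60))   # True  -> fine for 60 Hz
print(cpu_is_enough(cpu_limited_fps=130, target_fps=120))  # False -> marginal for 120 Hz
```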
 

tokkun

Member
Plenty of people do video encoding. Many of them don't even realize it. If you're streaming your gameplay, you're doing video encoding. In real time while playing a game. And hammering your CPU unless you're using ShadowPlay.

That's fine. I have no problem with having video encoding benchmarks in reviews. I was noting how the number of them seems to keep increasing each year.

All I can say about the rest of your post is that by your logic, we should all still be using Nehalem or those amazing AMD FX CPUs. The reality is that these benchmarks with the game running at 640x480 or something are there just to show the raw CPU IPC improvements, but there's plenty of benchmarks with a sane resolution which demonstrate that frame times are much improved starting with Sandy Bridge and then Ivy Bridge, Haswell, and now Skylake. Past a certain point, yeah, you spend 0 frames beyond 16ms and you're locked at 60 fps. There are games out there where you don't meet that particular performance requirement until beyond Sandy Bridge.

[Image: Far Cry 4 frame-time chart (frames beyond 16ms)]

I would respond to your first point, but I can't decipher what argument you are trying to pin to me.

As for frame time comparisons, I think they are great. As long as they are collected using a sensible hardware configuration and game settings, they can demonstrate CPU impacts that occur in realistic situations and are actually perceptible to normal users. That is exactly what I have been advocating. I certainly find them more useful than comparing average frame rates above 120 fps, which I would like to point out is the thing I was criticizing since it seems I'm now having to fend off accusations of being some kind of Luddite in addition to that.
 
If I have absolutely no PC right now and was planning to get one around the holidays, should I grab the i7 or try and get the last gen for a fresh build?

I'm putting a 980 Ti into it

I would seriously save a wad of money and get a 4790K over the 6700K.

Only get previous generation hardware if you can find someone selling it second hand for cheap. Intel doesn't lower prices for older inventory.

Wrong, as the 4790K can be had for a lot cheaper than the 6700K right now. At least in the UK. I expect the same for the US.

4790K - £259.99
6700K - £319.99

And with the 6700K being slightly slower than the 4790K in gaming or a wash (depending on which benches you believe), and only around 5-10% faster than the Haswell processor in general CPU tasks, it isn't worth the premium. You'll save money with the 4790K on the mobo and DDR3 memory, too (perhaps as much as £150).

It makes more sense to buy a cheaper 4790K system to last 3 years than to buy a more expensive Skylake 6700K one to last the same time. The 4790K is going to be almost identical in performance over that period, and in either case you'll be upgrading in 3 years (if you so please).
 

FireFly

Member
To me, CPU tech is really only improving when it is getting better at doing the things its audience actually does already or would like to do. I do acknowledge there is some grey area here, where it may be useful to try to predict how a product might perform in some future scenario that is not here yet, but personally I just don't see those benchmarks you posted as providing any useful insight in that way. If anything, it seems like the trend might be toward less reliance on the CPU if DX12 actually provides any of the benefit that is claimed.
So, I am about to upgrade my PC after 6 years (still using an i7 920). I want to know if buying a Skylake CPU will give me any more future proofing than buying a Haswell CPU. The benchmarks from 3 years in the future don't exist yet, but I am buying a new PC now.

Now what you seem to be suggesting is that it is better to have no knowledge of the performance potential of a new CPU than to have knowledge that may turn out to be inaccurate. Suppose in CPU limited applications Skylake is 50% faster. Is this irrelevant to my decision as to whether to buy Skylake over Haswell?
 

cyen

Member
Plenty of people do video encoding. Many of them don't even realize it. If you're streaming your gameplay, you're doing video encoding. In real time while playing a game. And hammering your CPU unless you're using ShadowPlay.

All I can say about the rest of your post is that by your logic, we should all still be using Nehalem or those amazing AMD FX CPUs. The reality is that these benchmarks with the game running at 640x480 or something are there just to show the raw CPU IPC improvements, but there's plenty of benchmarks with a sane resolution which demonstrate that frame times are much improved starting with Sandy Bridge and then Ivy Bridge, Haswell, and now Skylake. Past a certain point, yeah, you spend 0 frames beyond 16ms and you're locked at 60 fps. There are games out there where you don't meet that particular performance requirement until beyond Sandy Bridge.

[Image: Far Cry 4 frame-time chart (frames beyond 16ms)]

There is something wrong with that graph; a 5960X performing so "bad"?
 