
Skylake review thread

longdi

Banned
Wonder if my i5-4690 will be good for the next, say, 5 years. Not really interested in anything above 1080p/60fps.

The way I see it:
If you use a 2500K or a lesser CPU, Skylake is worth considering as an upgrade.
If you are using a 2600K, then you should go straight to X99 + 5820K for the most bang.
Reviews put Skylake overclocks at 4.6-4.8GHz, a slight improvement over Ivy and Haswell. I wonder how much higher it can go once people start delidding it.

Skylake isn't that bad. Remember when X58 CPUs showed big gains over Core 2 only in encoding tasks? Over time, with some overclocking, X58 pulled away from the venerable Core 2 in more things, like the new generation of games.

Of course Intel had to drop some features from consumer Skylake. Features like AVX3 and TSX; who knows how much those would have helped?
 

Man

Member
They had to drop TSX from the consumer versions? The 6700K?
First time I'm hearing this about Skylake. I tried Googling but found no results. It was dropped (patched out) on Haswell due to a bug.
 

Grassy

Member
My i7-2600k + Asus P8Z68-V Pro is probably the best 1-2 combo in PC building I've ever purchased. I've had them going on 4 years and, like you, simply had to upgrade the graphics card. Amazing value.

I've had my i7 2600K (4.5GHz) + Asus P8P67 Pro for 4+ years and they're still going strong. The only thing I've upgraded is the video card: from an HD 6970 to 670 SLI 2 years ago, to 980 Ti SLI recently.
I was set to do a new build with Skylake, but at this stage I'm not upgrading until my memory, mobo, or CPU dies. It's just not worth it.
 

Thrakier

Member
I've had my i7 2600K (4.5GHz) + Asus P8P67 Pro for 4+ years and they're still going strong. The only thing I've upgraded is the video card: from an HD 6970 to 670 SLI 2 years ago, to 980 Ti SLI recently.
I was set to do a new build with Skylake, but at this stage I'm not upgrading until my memory, mobo, or CPU dies. It's just not worth it.

Yeah, the jump is too small. I'm using an OC'd i5 2500K and I can still play a beast like TW3 at almost max settings at 1080/60 by upgrading just my GPU.
 

Evo X

Member
I've had my i7 2600K (4.5GHz) + Asus P8P67 Pro for 4+ years and they're still going strong. The only thing I've upgraded is the video card: from an HD 6970 to 670 SLI 2 years ago, to 980 Ti SLI recently.
I was set to do a new build with Skylake, but at this stage I'm not upgrading until my memory, mobo, or CPU dies. It's just not worth it.

Dude, your 980Ti SLI setup is definitely being held back by the PCI-E 2.0 x8 setup on that motherboard. You would notice a pretty big jump by moving to a newer build.

How long is it till Skylake-E launches?

One year. Planned for Fall 2016 currently.
 
I'm bummed about the performance. A year ago I was really looking forward to Skylake. A mishap with my 2500K forced me to upgrade a few months ago, but since I still had hope that Skylake would do well, I held back my motherboard spending just in case. Guess I should have just gone all out, because at this rate of progress my 4790K is going to last longer than a pet turtle.
 

LilJoka

Member
Some interesting benches for you.

Civ, which is n-threaded

civ-99th.gif

civ-16ms.gif


Project Cars, which is notoriously CPU heavy

pcars-99th.gif

pcars-curve.gif

pcars-8ms.gif


For 120Hz gaming, I'd say Skylake is finally a worthy upgrade from Sandy, for both the performance and the features.
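For context on how graphs like the ones above are built: the 99th-percentile figure comes from a capture of per-frame render times. A minimal Python sketch, using made-up frame times rather than anything from these charts:

```python
# Hypothetical per-frame render times in milliseconds, standing in for a
# FRAPS/FCAT-style capture (made-up numbers, not data from the graphs above).
frame_times = [16.7] * 95 + [20.0, 22.0, 25.0, 30.0, 40.0]

def percentile(data, pct):
    # Nearest-rank percentile: the frame time that pct% of frames stay under.
    ordered = sorted(data)
    rank = max(1, round(pct / 100 * len(ordered)))
    return ordered[rank - 1]

p99 = percentile(frame_times, 99)
avg_fps = 1000 * len(frame_times) / sum(frame_times)
print(f"99th percentile frame time: {p99:.1f} ms")  # the spikes dominate this metric
print(f"average fps: {avg_fps:.1f}")                # the average mostly hides them
```

Even in this toy data you can see why reviewers use the metric: the average still looks like ~58fps while the worst frames take 30ms or more.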

Hardly anyone is going to be running a 2500K at stock clocks, though. And although these games are well threaded, you are still comparing against CPUs that turbo to 4.4GHz.
 

orochi91

Member
My overclocked 4790k is gonna last me a long time, at least judging by these benchmarks.

AMD, this is your chance!

Pounce on it!
 

Tovarisc

Member
My motherboard is slowly dying, and it pains me to think how much I'm going to have to spend for such a small performance improvement over my current Ivy Bridge setup. I can't believe we won't see the next real step up in performance until almost 2018.

This no competition thing sucks balls. Just like it did in the 90s.

Competition isn't some magic bullet that makes developing the tech 10x easier and faster. Could we see maybe 1% more improvement per gen? Maybe, but then think about all the insane challenges Intel is facing now with 14nm and 10nm production. They are almost counting atoms by now.

People expected way way too much from Skylake.
 

ZanDatsu

Member
Besides the improvements measured by the benchmarks we've seen so far, aren't there any other features -- besides DDR4 -- that this new architecture is supposed to allow for? I know Intel will be having a conference sometime this month to talk more about this kind of thing, but I'm not too sure if it'll reveal anything of particular interest to the gaming community.

Still, I was always more looking forward to what Skylake would bring to the laptop space, so I can't say I'm too disappointed about the fact I won't need to upgrade my desktop for a while.
 

Tovarisc

Member
Besides the improvements measured by the benchmarks we've seen so far, aren't there any other features -- besides DDR4 -- that this new architecture is supposed to allow for? I know Intel will be having a conference sometime this month to talk more about this kind of thing, but I'm not too sure if it'll reveal anything of particular interest to the gaming community.

Still, I was always more looking forward to what Skylake would bring to the laptop space, so I can't say I'm too disappointed about the fact I won't need to upgrade my desktop for a while.

If you're interested in why some people are excited about Skylake and Z170, read this http://forums.anandtech.com/showpost.php?p=37333562&postcount=1 and then do some Googling on both parts. Even if Skylake as a CPU didn't blow minds across the globe yesterday, it's still a 14nm processor with a good handful of nice platform upgrades, on both the CPU and chipset fronts.
 
I'm planning on upgrading my 2500k because my mobo sucks, but I don't know if I should go with Skylake or a 5820k. Do games take advantage of the two extra cores? Doesn't look like it matters much when looking at the benchmarks posted above.
 

Tovarisc

Member
I'm planning on upgrading my 2500k because my mobo sucks, but I don't know if I should go with Skylake or a 5820k. Do games take advantage of the two extra cores? Doesn't look like it matters much when looking at the benchmarks posted above.

Well, now that Skylake has fallen on its face in some people's eyes, 6-core CPUs are being drummed up as "future proof, because DX12 will give them huge performance boosts"! Whether that actually happens, and whether DX12 gives 6+ core CPUs a very noticeable edge a few years from now, is anyone's guess. If I were weighing the 5820K against Skylake, I would ask myself whether I do enough multitasking and/or core-hungry work (e.g. video editing and encoding) to benefit from the investment now and within the next 12 months to justify the 5820K.

Basically, the 5820K has a performance edge if you are doing work that benefits from extra cores/threads, plus a marginal gaming boost from having two x16 PCI-E slots (depending on mobo) if you are running SLI. For gaming, having 6 cores instead of 4 means next to nothing as of now, AFAIK.
 

Grassy

Member
Dude, your 980Ti SLI setup is definitely being held back by the PCI-E 2.0 x8 setup on that motherboard. You would notice a pretty big jump by moving to a newer build.

What I've read about the differences between PCI-E 2.0 and 3.0 and x8/x16 lanes suggests there's a negligible difference, if any. Sometimes up to 5fps, give or take a few. Although a new chipset and its features are something other than framerates to think about as well.
 
Not guaranteed for locked 60fps, but cool beans if you tolerate sporadic dips to 50s.
Actually, I am totally fine with dialing the resolution down. Even 720p is pretty great for me. A good framerate is better, but dips into the 50s from time to time are totally fine by me.
 

Evo X

Member
What I've read about the differences between PCI-E 2.0 and 3.0 and x8/x16 lanes suggests there's a negligible difference, if any. Sometimes up to 5fps, give or take a few. Although a new chipset and its features are something other than framerates to think about as well.

x8 PCI-E 3.0 is fine for now because it has the same bandwidth as x16 PCI-E 2.0. But x8 PCI-E 2.0 is pretty slow and definitely hampers the 980 Ti.

I know because I have a Z68 motherboard, which is very similar to your P67, and it held back my Titan X SLI. There's not much point in splurging on such a powerful GPU setup if it's being starved for bandwidth. I'm thinking about switching to a 5930K setup so I can run both cards (and future GPUs) at full x16 3.0 speed, along with a pair of M.2 PCI-E x4 SSDs, without any worry.

Even if you switch to Skylake instead of Haswell-E, you will still get twice the GPU bandwidth you have now.
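The equivalence claimed above checks out on paper: PCIe 2.0 signals at 5 GT/s per lane with 8b/10b encoding, while PCIe 3.0 signals at 8 GT/s with the more efficient 128b/130b encoding. A quick back-of-the-envelope sketch (per-lane figures from the PCIe specs; the helper function is just for illustration):

```python
# Per-lane throughput in MB/s from transfer rate (GT/s) and line encoding.
def lane_mb_s(gt_per_s, payload_bits, total_bits):
    # rate * encoding efficiency gives usable Gb/s; divide by 8 bits/byte
    # for GB/s, then multiply by 1000 for MB/s.
    return gt_per_s * (payload_bits / total_bits) / 8 * 1000

pcie2 = lane_mb_s(5, 8, 10)     # PCIe 2.0: 500 MB/s per lane
pcie3 = lane_mb_s(8, 128, 130)  # PCIe 3.0: ~985 MB/s per lane

print(f"x16 PCIe 2.0: {16 * pcie2:.0f} MB/s")  # 8000 MB/s
print(f"x8  PCIe 3.0: {8 * pcie3:.0f} MB/s")   # ~7877 MB/s, roughly equal
print(f"x8  PCIe 2.0: {8 * pcie2:.0f} MB/s")   # 4000 MB/s, half the bandwidth
```

Which is why x8 3.0 is effectively a wash against x16 2.0, while x8 2.0 gives only half that.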
 

Finalizer

Member
Wonder if my i5-4690 will be good for the next say 5- years. Not really interested in anything above 1080p/60fps.

Resolution/framerate is already tied much more heavily to the GPU than to the CPU in most cases, so there's probably no need to worry about upgrading any time soon if Intel keeps going at the rate they've been at. I'm planning to run my own 4690K for at least half a decade, unless something really shakes things up on the CPU side before then.
 
Personally, I'm waiting for mobile benchmarks before drawing any conclusions. Lots of neat new features on the desktop platform, but not much of a speed improvement. Assuming Kaby Lake is going to be Skylake's Devil's Canyon, that's what desktop enthusiasts should be waiting for.

It feels to me like the delay of the improvements Kaby Lake has over Skylake is artificial.
 

Tovarisc

Member
Eurogamer ran benches @ 1080p

Core i5 6600K vs Core i5 4690K (same 3.5-3.9GHz base/turbo)
  • 17% faster @ The Witcher 3
  • 1% faster @ GTA V
  • 10% faster @ Battlefield 4

Core i5 6600K vs Core i5 3570K (3.5-3.9GHz vs 3.4-3.8GHz base/turbo)
  • 22.4% faster @ The Witcher 3
  • 20.6% faster @ GTA V
  • 18.1% faster @ Battlefield 4

Core i5 6600K vs Core i5 2500K (3.5-3.9GHz vs 3.3-3.7GHz base/turbo)
  • 25.8% faster @ The Witcher 3
  • 31.7% faster @ GTA V
  • 25% faster @ Battlefield 4


http://www.eurogamer.net/articles/digitalfoundry-2015-intel-skylake-core-i5-6600k-review
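As an aside on reading these numbers: "X% faster" in reviews like this is just the relative change in average framerate. A trivial sketch with made-up fps values, not Eurogamer's actual data:

```python
# "X% faster" as the relative change in average fps between two CPUs.
# These fps values are invented placeholders, not figures from the review.
def percent_faster(fps_new, fps_old):
    return (fps_new / fps_old - 1) * 100

print(f"{percent_faster(70.2, 60.0):.1f}% faster")  # a 60 -> 70.2 fps jump is 17.0% faster
```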
 

LilJoka

Member
Eurogamer ran benches @ 1080p

Core i5 6600K vs Core i5 4690K (same 3.5-3.9GHz base/turbo)
  • 17% faster @ The Witcher 3
  • 1% faster @ GTA V
  • 10% faster @ Battlefield 4

Core i5 6600K vs Core i5 3570K (3.5-3.9GHz vs 3.4-3.8GHz base/turbo)
  • 22.4% faster @ The Witcher 3
  • 20.6% faster @ GTA V
  • 18.1% faster @ Battlefield 4

Core i5 6600K vs Core i5 2500K (3.5-3.9GHz vs 3.3-3.7GHz base/turbo)
  • 25.8% faster @ The Witcher 3
  • 31.7% faster @ GTA V
  • 25% faster @ Battlefield 4



http://www.eurogamer.net/articles/digitalfoundry-2015-intel-skylake-core-i5-6600k-review

This almost sounds too good to be true
 

Rafterman

Banned
If we remember that it's a two-generation jump (incl. Broadwell), it's not thaaat exciting.

Not to mention that people with K-series chips are probably overclocking them pretty easily. Considering the added cost of a new motherboard and RAM, and the fact that I have a chip that sits comfortably at 4.8GHz, I don't see the point in upgrading for a while. I'd get much better value out of buying more GPU.

I can't remember the last time my CPU lasted this long. I bought my chip in Feb. 2011; four and a half years is insane for me. Sucks, though: if you'd told me back then that we'd still be slogging along at these speeds, I'd have laughed in your face.
 

daninthemix

Member
If we remember that it's a two-generation jump (incl. Broadwell), it's not thaaat exciting.

For the cost, certainly not, and that's only for people who aren't GPU-bound, which most of us are. Even those with Titan Xs / 980 Tis will once again become GPU-bound by the next swathe of games and the cycle will repeat. I doubt many Titan / 980 Ti owners are running at 1080p either.

In other words, the DF benchmarks do a good job of showing the differences between the processors, but not in a way that will affect many of us in reality because almost all of us are GPU-bound. Even when we're not - during those brief moments following the release of a new super-duper GPU - it's only months before a new game renders you GPU-bound again.
 

Mechazawa

Member
It's looking more and more like if I wanna upgrade my CPU at any point during this generation, I should just get another Haswell and save myself the motherboard cost.
 

Renekton

Member
For the cost, certainly not and that's only for people who aren't GPU-bound - which most of us are. Even those with Titan X's / 980 Tis will once again become GPU-bound by the next swathe of games and the cycle will repeat. I doubt many Titan / 980Ti owners are running at 1080p either.

In other words, the DF benchmarks do a good job of showing the differences between the processors, but not in a way that will affect many of us in reality because almost all of us are GPU-bound. Even when we're not - during those brief moments following the release of a new super-duper GPU - it's only months before a new game renders you GPU-bound again.
CPU does factor in at 60Hz or more. If I tweak settings to let the GPU hit 60 with headroom to spare, the GPU is taken out of the framerate equation, but the CPU may still choke with sporadic dips into the low 50s.

An i5 4690 will still dip semi-frequently in The Witcher 3.
 

Darkkn

Member
I've been waiting for a reason to upgrade my 2500K, but it seems like CPUs are not progressing much for the desktop use case :/ Fingers crossed for the AMD Zen models. Competition is sorely needed in the CPU space.

Any info on when the Skylake 6-core models are coming out? Maybe those would be worth upgrading to. Why do all the desktop Intel processors even have GPUs in them? A complete waste of transistors for gamers.
 

Renekton

Member
Any info on when the Skylake 6-core models are coming out? Maybe those would be worth upgrading to. Why do all the desktop Intel processors even have GPUs in them? A complete waste of transistors for gamers.
There seems to be demand for those, e.g. Apple AIOs.
 

Shaldome

Member
6700K delidded

http://www.overclock.net/t/1568357/skylake-delidded



Once Intel stopped soldering the IHS and started using shitty TIM, we have seen this time and time again.

The TIM is not shitty, and it is not the cause of the problem. The reason for the higher temperatures is that the heat spreader is not making full contact with the die, so you have air in between.
Although I agree that it is shitty for people who want to overclock their CPUs, and that the ones with a soldered IHS do not have this problem.
 

GeoNeo

I disagree.
The TIM is not shitty, and it is not the cause of the problem. The reason for the higher temperatures is that the heat spreader is not making full contact with the die, so you have air in between.
Although I agree that it is shitty for people who want to overclock their CPUs, and that the ones with a soldered IHS do not have this problem.

The TIM is not of the highest quality. People have been delidding for a while now and seeing big temp drops by replacing the TIM and putting the IHS back on. I helped a friend delid his Haswell CPU and we saw a great drop in temps.

Though the best method is still going direct-die and using a die guard (shim) like the one MSI made for their Z97 boards, to ensure you don't crack it.

http://i.imgur.com/h5nXU6m.jpg
 
Gaming? Oh dear... From AT:

There's no easy way to write this.

Discrete graphics card performance decreases on Skylake over Haswell.

This doesn’t particularly make much sense at first glance. Here we have a processor with a higher IPC than Haswell but it performs worse in both DDR3 and DDR4 modes. The amount by which it performs worse is actually relatively minor, usually -3% with the odd benchmark (GRID on R7 240) going as low as -5%. Why does this happen at all?

Eurogamer ran benches @ 1080p

Core i5 6600K vs Core i5 4690K (same 3.5-3.9GHz base/turbo)
  • 17% faster @ The Witcher 3
  • 1% faster @ GTA V
  • 10% faster @ Battlefield 4

Core i5 6600K vs Core i5 3570K (3.5-3.9GHz vs 3.4-3.8GHz base/turbo)
  • 22.4% faster @ The Witcher 3
  • 20.6% faster @ GTA V
  • 18.1% faster @ Battlefield 4

Core i5 6600K vs Core i5 2500K (3.5-3.9GHz vs 3.3-3.7GHz base/turbo)
  • 25.8% faster @ The Witcher 3
  • 31.7% faster @ GTA V
  • 25% faster @ Battlefield 4



http://www.eurogamer.net/articles/digitalfoundry-2015-intel-skylake-core-i5-6600k-review

Anyone know what might be going on here?
 

Shaldome

Member
The TIM is not of the highest quality. People have been delidding for a while now and seeing big temp drops by replacing the TIM and putting the IHS back on. I helped a friend delid his Haswell CPU and we saw a great drop in temps.

Though the best method is still going direct-die and using a die guard (shim) like the one MSI made for their Z97 boards, to ensure you don't crack it.

http://i.imgur.com/h5nXU6m.jpg

Yes, but that is not mainly due to the new TIM (if you use a better TIM you can of course gain a few more degrees), but because the heat spreader/cooler makes better contact with the die. In the end delidding helps, so if you just want better temps, the why doesn't matter.
 

LilJoka

Member
Give these DigitalFoundry videos a listen/watch; they explain the relevant stuff at the beginning.

https://www.youtube.com/watch?v=JWxncqbe1H8
https://www.youtube.com/watch?v=WZ_5p9wd2dk

Basically, if a game hits the CPU barrier and becomes CPU bound, then Skylake will (noticeably) outperform previous CPU generations. In some games the performance gains from Skylake are larger than in others, and in some they're at the same level or even a few % worse.

Yes, but you would expect all their scenarios to be GPU limited. That's why the results surprised me.
 

Tovarisc

Member
Yes, but you would expect all their scenarios to be GPU limited. That's why the results surprised me.

Like they mention in the videos, they use specific GPU + game settings combos to minimize the GPU's impact on performance and basically force the games to become CPU bound. With an overkill card like an overclocked Titan X that's possible at normal gameplay resolutions; many sites run similar tests at 640x480 and the like.
 