
Leak pegs desktop Broadwell, Skylake for mid-year

elyetis

Member
No, it was always planned for around '16. Big Maxwell is coming first this year.
Just checked the news from GTC 2014 and you are absolutely right. I think I keep making that mistake because unified memory was originally supposed to be part of Maxwell.
 

LeleSocho

Banned
Oh, I missed yet another Pascal delay?

It's been like that since they officially launched Maxwell


Edit: i'm late :p
 
Thinking about upgrading my 3770K and 680 at the end of the year, but I might just buy a better cooler, OC that bitch a little higher, and step up the GPU when the new 20nm parts come out.
 
Thinking about upgrading my 3770K and 680 at the end of the year, but I might just buy a better cooler, OC that bitch a little higher, and step up the GPU when the new 20nm parts come out.

There probably won't be 20nm GPUs this year; they'll still be 28nm, and then they'll go down to 16nm next year.
 

SliChillax

Member
I have a 2600K OC'd to 4.4GHz. I have that upgrade itch, but I want to go from 4 cores to 8 cores and DDR4 RAM as well. I hope Skylake is the upgrade that Sandy Bridge was.

Same, I'll upgrade once DDR4 isn't so expensive. Probably somewhere in Q4 2015 or 2016.
 

Exuro

Member
Another i7 920 holdout here. My mobo has been having problems, but I think I can last until Skylake. At least that's the plan.
 

LordOfChaos

Member
Wow, I was planning on getting a Skylake MacBook Pro in 2016. Looks like I'll need to start saving up a lot earlier than that ._.

I have the Haswell one, and I'm half contemplating selling it for Skylake; the upgrade shouldn't cost much given how well these hold their resale value. Depends on what they show of it. If the IGP is a buttload better than the Iris Pro 5200 in this, I might do it. 20% from Broadwell and another 50% from Skylake compounds to roughly 80% more performance (1.2 × 1.5 = 1.8) if those claims hold up. It can already run every modern game I've thrown at it besides Ryse.
 

Jedi2016

Member
My next upgrade's going to require a new mobo regardless, so I might as well get the latest and greatest when the time comes.
 

Fractal

Banned
Yeah, I'm counting on that. Will go for Skylake as a replacement for my 2600K. It served me well, but I think it's slowly outstaying its welcome.

In any case, CPU releases may not be too exciting these days due to fairly small performance improvements, but if you pick up a good one you'll be set for a while, especially if you overclock.
 

Human_me

Member
Another in the i7 9x0 club; I'll get a new one probably Q4 2015 or Q1 2016. At least 6 cores or bust.

Yeah, I'll probably go for a 6-core one as well (if they come out).

Alternatively, I'll get a Broadwell or Haswell if their prices are low when Skylake arrives.
 
I'm looking at a huge upgrade myself.

I'm running an i7 920 on a now 5 year old PC.


This is my upgrade path:

DDR3 -> DDR4
PCIe 2.0 -> PCIe 3.0
SATA 2.0 HDD -> SATA 3.0 SSD
USB 2.0 -> USB 3.0
Windows 7 -> Windows 10


I'm going to stick with my GTX 770 4GB for now.

But just about everything else in my case is getting replaced this year. Including my OS. Should be pretty frightening lol
 
Skylake: 20 percent IPC gain over Haswell?...
New node, new architecture, and that's it?

Moore's law hasn't applied for years in CPU land, and it seems it isn't going to in the near future either.
At this rate, in ten years we'll have CPUs that are 2.5x faster than Haswell... why even bother "upgrading" at all anymore?

Technological progress at a virtual standstill.
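(For what it's worth, that ~2.5x figure is just the 20% compounded; a quick sanity check in Python, assuming a roughly two-year cadence per generation, which is my own guess:)

# Compound ~20% IPC per generation over ten years (assumed ~2-year cadence).
gain_per_generation = 0.20
generations = 10 // 2
print((1 + gain_per_generation) ** generations)  # ~2.49, i.e. roughly 2.5x Haswell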
 

Manoko

Member
I'm planning (and have already started saving) to build a completely new rig when Skylake becomes available (so early 2016, right?) to truly enjoy the consumer version of the Oculus Rift.

I currently have a Q6600, DDR2 RAM, and an HD 5850.

I believe the jump in performance will be pretty significant, to say the least... I can't wait. :)
 

LeleSocho

Banned
Skylake: 20 percent IPC gain over Haswell?...
New node, new architecture, and that's it?

Moore's law hasn't applied for years in CPU land, and it seems it isn't going to in the near future either.
At this rate, in ten years we'll have CPUs that are 2.5x faster than Haswell... why even bother "upgrading" at all anymore?

Technological progress at a virtual standstill.

New node, new architecture, DDR4 as standard, a new VP9/H.265 encoder and decoder, Thunderbolt 3, etc.
It's great IMHO.
 

Grief.exe

Member
Skylake: 20 percent IPC gain over Haswell?...
New node, new architecture, and that's it?

Moore's law hasn't applied for years in CPU land, and it seems it isn't going to in the near future either.
At this rate, in ten years we'll have CPUs that are 2.5x faster than Haswell... why even bother "upgrading" at all anymore?

Technological progress at a virtual standstill.

ARM processors are still seeing solid gains. x86 and consoles have completely stagnated due to lack of competition.
 

LordOfChaos

Member
Skylake: 20 percent IPC gain over Haswell?...
New node, new architecture, and that's it?

Moore's law hasn't applied for years in CPU land, and it seems it isn't going to in the near future either.
At this rate, in ten years we'll have CPUs that are 2.5x faster than Haswell... why even bother "upgrading" at all anymore?

Technological progress at a virtual standstill.


Yes, it has. Moore's Law is about transistor counts, not performance. Those transistors can go towards performance, dedicated features, or power consumption, and the trend has just been toward the latter two. People just spread the wrong definition of the law across the interweb.

ARM processors are still seeing solid gains. x86 and consoles have completely stagnated due to lack of competition.

Performance has stagnated in favour of power draw. Laptops outsell desktops by an order of magnitude. If AMD were strong, Intel would still be focused on power draw. Their internal rule is that anything that increases power draw by 1% has to increase performance by 2%, or it doesn't go in.
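A toy sketch of that gate as it's described (my own illustration in Python, not Intel's actual process):

# Reported rule of thumb: a candidate feature only ships if its performance
# gain is at least twice its power-draw increase.
def feature_goes_in(perf_gain_pct, power_increase_pct):
    return perf_gain_pct >= 2.0 * power_increase_pct

print(feature_goes_in(3.0, 1.0))   # True: +3% performance for +1% power
print(feature_goes_in(1.5, 1.0))   # False: not worth the extra power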
 

JaseC

gave away the keys to the kingdom.
I really need to brush up on computer hardware, I'm so out of the loop.

Here's a crash course:

CPU: Intel > AMD
GPU: Nvidia ~ AMD, with the direction of the swing chiefly depending on the importance of PhysX and day-one drivers

But as always, once you've nailed down a budget, it's best to seek an informed consensus.
 
Skylake: 20 percent IPC gain over Haswell?...
New node, new architecture, and that's it?

Moore's law hasn't applied for years in CPU land, and it seems it isn't going to in the near future either.
At this rate, in ten years we'll have CPUs that are 2.5x faster than Haswell... why even bother "upgrading" at all anymore?

Technological progress at a virtual standstill.

Moore's law is still happening when you go by transistor count.
 

The Boat

Member
Here's a crash course:

CPU: Intel > AMD
GPU: Nvidia ~ AMD, with the direction of the swing chiefly depending on the importance of PhysX and day-one drivers

But as always, once you've nailed down a budget, it's best to seek an informed consensus.
The budget's always the trickiest part :p Thanks.
 
Realistically, all of the 14nm processors should be great from a performance-per-watt perspective. Not sure if I'll pull the trigger on a Broadwell or wait another few months for a Skylake; I'm guessing Intel will make the decision for me by which CPUs they actually put on the market with 4 cores + low TDP.

But if all people look at is clock speed, then...

Sorry? ¯\_(ツ)_/¯
 
Here's a crash course:

CPU: Intel > AMD
GPU: Nvidia ~ AMD, with the direction of the swing chiefly depending on the importance of PhysX and day-one drivers

But as always, once you've nailed down a budget, it's best to seek an informed consensus.

It probably doesn't matter to many people, but Nvidia has a much better power consumption to performance ratio.
 
Moore's law is still happening when you go by transistor count.

Yes, it has. Moore's Law is about transistor counts, not performance. Those transistors can go towards performance, dedicated features, or power consumption, and the trend has just been toward the latter two. People just spread the wrong definition of the law across the interweb.



Performance has stagnated in favour of power draw. Laptops outsell desktops by an order of magnitude. If AMD were strong, Intel would still be focused on power draw. Their internal rule is that anything that increases power draw by 1% has to increase performance by 2%, or it doesn't go in.

Performance is at a standstill, happy now? You know very well what I meant.
Any transistor count gains are wasted on shitty integrated GPU performance in a high-end-bracket desktop CPU that most people are going to use a dedicated GPU with anyhow...

Also, the moment AMD could not compete anymore, Intel simply shrank their die sizes when they shrank the process node (effectively selling lower-end parts as the high end).
We've gone from 270-300mm² dies to 150-170mm² ones... what's the point of smaller transistors if it doesn't result in getting more of them, nor in getting the same amount of them for less money?

This is a gaming forum and we are getting fucked hard by Intel now that they have a monopoly in the high end.
Moore's law is meaningless to the consumer if it doesn't translate into performance per dollar.

20 percent more IPC on Skylake? Count on a 20 percent price hike to go with it.

Skylake was the last hope for the first decent CPU jump in four years.
 

LordOfChaos

Member
Performance is at a standstill, happy now? You know very well what I meant.
Any transistor count gains are wasted on shitty integrated GPU performance in a high-end-bracket desktop CPU that most people are going to use a dedicated GPU with anyhow...

*Shrugs*

Don't get upset so fast, I'm just explaining. Moore's Law is alive and well, so declaring its death was incorrect, and it happens all too often online.

Using that die space for more cores would be better for us enthusiasts pairing those with discrete GPUs, but Intel isn't full of idiots; for whatever combination of reasons, they decided bolting on graphics was more beneficial to them and the majority of customers.

This is a gaming forum and we are getting fucked hard by Intel now that they have a monopoly in the high end.

Again, they'd be focused on power consumption whether or not AMD is strong, with laptops far outselling desktops, and even laptops being gobbled away at by smaller devices. So they've thrown more transistors at having similar performance at lower power draw, rather than more performance, for the last few generations.
 
*Shrugs*

Don't get upset so fast, I'm just explaining. Moore's Law is alive and well, so declaring its death was incorrect, and it happens all too often online.

Using that die space for more cores would be better for us enthusiasts pairing those with discrete GPUs, but Intel isn't full of idiots; for whatever combination of reasons, they decided bolting on graphics was more beneficial to them and the majority of customers.



Again, they'd be focused on power consumption whether or not AMD is strong, with laptops far outselling desktops, and even laptops being gobbled away at by smaller devices. So they've thrown more transistors at having similar performance at lower power draw, rather than more performance, for the last few generations.

Self-fulfilling prophecy: people have been given no reason to upgrade their desktops for years now.

I'm not upset at you, I'm upset at the effect this is going to have on gaming (and the fact that there is no value for consumers anymore).
144Hz monitors? What's the point when CPUs aren't good enough for them in newer games and won't be for the next ten years, unless you want to shell out 1000 euros for a 6- or 8-core, mislabeled "Intel Extreme" CPU that should cost a quarter of that.
The entry cost of gaming goes up at the low end: in 2009, when AMD could compete, a 130-euro CPU, a 50-euro motherboard, and 30 euros' worth of RAM got you into 60fps higher-end gaming; now it's 400 euros combined to achieve the same thing, as an i3 doesn't cut it for that.

Monopolies fucking suck.

Btw, what does 'focusing on power consumption' even MEAN?
As if performance/watt hasn't similarly improved every time there was a new process node and architecture in the past?
When you have a 300mm², 120-watt high end, you still have lower-power parts, just like always; it's called the lower end.
'Focusing on power consumption' is marketing speak for selling you low-end shit at a premium price.
What is better? Being told you're being sold high-end parts and paying out the wazoo for them, or buying a lower-end (in name) part and having higher-end options available?

It's like if AMD had released the HD 4770 in 2009, priced it at 400 euros, and said 'we are focusing on power consumption'.
 

JaseC

gave away the keys to the kingdom.
It probably doesn't matter to many people, but Nvidia has much better power consumption : performance ratio.

Yeah, that's why I said "chiefly": enhanced visual effects and better day-one support are things people are more likely to take into account versus one's yearly electricity bill.
 

LordOfChaos

Member
Btw, what does 'focusing on power consumption' even MEAN?
As if performance/watt hasn't similarly improved every time there was a new process node and architecture in the past?
When you have a 300mm², 120-watt high end, you still have lower-power parts, just like always; it's called the lower end.
'Focusing on power consumption' is marketing speak for selling you low-end shit at a premium price.
What is better? Being told you're being sold high-end parts and paying out the wazoo for them, or buying a lower-end (in name) part and having higher-end options available?

It's like if AMD had released the HD 4770 in 2009, priced it at 400 euros, and said 'we are focusing on power consumption'.


They used to be designed for desktops first; now they're designed for laptops first. You'll note that under the old model we were lucky to get 3-4 hour battery life, even going between generations, since any power gain went to performance rather than energy frugality, while now 7-8 hours for desktop replacements and over 11 for ultrabooks is happening.

The 4770 analogy doesn't hold at all, since they aren't in the same laptop-first mindset for discrete GPUs. But those are going to be moving that way too: Nvidia will now develop for mobile (tablets/phones) first and then scale up for desktop.

Thing is, they'll have an easier time of scaling up, since with GPUs, if you have enough memory bandwidth, more cores/ROPs/TMUs will fall just short of linearly scaling performance. On CPUs it's not so easy, though you could throw more cores at the problem (though again, games more often than not hardly use 4 cores well; I do hope DX12 helps).


Just to be clear, I'm not saying I like it this way, nor am I trying to do marketing speak. I'm just saying this is the way it is; Intel has said so themselves. One could see it as throwing enthusiasts under the bus, but that's a much smaller market than the one for ultrabooks, notebooks, tablets, etc.
 
what's the point of smaller transistors if it doesn't result in getting more of them, nor in getting the same amount of them for less money?

The benefit of smaller transistors in modern processors is reduced power consumption, allowing higher-end processors to end up in smaller form factors. Were you not paying attention when Core M was announced? Nvidia is doing the same thing with Maxwell. The whole point of the architecture design was mobile first.

I also find it strange that you are complaining about Intel's allegedly slow innovation for gaming... when they've put a huge emphasis on integrated GPUs over the last few years... and gaming is much more likely to be bottlenecked by a GPU than a CPU.

Btw, what does 'focusing on power consumption' even MEAN?
As if performance/watt hasn't similarly improved every time there was a new process node and architecture in the past?

Hell no, power consumption was not as much of a priority before; increasing clock speed was. The entire tech world U-turned once the iPhone and iPad became mainstream products.

'Focusing on power consumption' is marketing speak for selling you low-end shit at a premium price.

No, it means the opposite: rapid improvements at the low end for minimal cost increase... along with improved battery life for mobile devices. The latest 13-inch MacBook Air didn't get a 12-hour battery life by magic.

Call it what you want, but Intel is definitely not at a standstill, and can't even afford to be.
 

Durante

Member
ARM processors are still seeing solid gains. x86 and consoles have completely stagnated due to lack of competition.
x86 hasn't really stagnated. The E5-2699s we have at work are amazingly fast :p

I also find it strange that you are complaining about Intel's allegedly slow innovation for gaming... when they've put a huge emphasis on integrated GPUs over the last few years...
I think he's talking about real enthusiast gaming, not that.
 
Yeah, I'm not replacing my OC'd i5-3570K until at least Cannon Lake, especially after spending money on a GTX 970 to replace my old GTX 560 Ti 448 Core, unless there's an affordable 6-core Skylake CPU or something.
 

thelastword

Banned
I'll be in the market for a new PC come December. I was thinking of picking the best parts available at that point, so it seems like I'm in for a treat.

I guess I should be in line for a DX12 card and a new motherboard chipset supported by a new processor, among many other things.
 