
Intel Details Haswell Microarchitecture, Desktop/Mobile Lineup Fully Revealed

How do we expect the laptop versions to be priced? I've been holding out on buying a new laptop until the CPUs are good enough for PS2/GC emulation at 720p/2xAA; given the clock speeds on these and the architectural improvements, hopefully that's enough.
 
Disappointed.

Apparently the mobile version of Haswell will only feature the HD 4600, which is just a tiny improvement over the HD 4000. The real improvement, the HD 5200, is desktop-only.

What is the point of this move? Desktop users can add a graphics card any time they want. It is the mobile market that would really benefit from a good iGPU.

How do we expect the laptop versions to be priced? I've been holding out on buying a new laptop until the CPUs are good enough for PS2/GC emulation at 720p/2xAA; given the clock speeds on these and the architectural improvements, hopefully that's enough.

You might actually just want to buy an Ivy Bridge laptop when it becomes cheap.
 
The big deal with the mobile ones is that battery life might be hugely improved. There's all sorts of crazy tech they've implemented to ensure the chips sip power. The VRM moving on-die is a big part of that, because they'll be able to fine-tune voltages, which means a wider range of speeds and supply voltages that sufficiently power the chip without providing too much.

Anandtech did a big writeup on them a number of months back that is worth a gander.
 
I want to know how big that on-chip GPU memory is and see how well it works. I'm really curious.

Desktop CPU wise, I still see no reason to upgrade from my i7 920 (running at 3.6 GHz).

Heard it was 128MB or something like that. I think it was mentioned in one of the first episodes of the AnandTech podcast, but I'm not sure. We'll soon find out.

EDIT: Here's the quote from Anand

AnandTech said:
Haswell will do what Ivy Bridge didn't. You'll see a version of Haswell with up to 128MB of embedded DRAM, with a lot of bandwidth available between it and the core. Both the CPU and GPU will be able to access this embedded DRAM, although there are obvious implications for graphics.

http://www.anandtech.com/show/6355/intels-haswell-architecture/12

But nevermind Haswell. Skylake's where it's at.
 
So still no prices then?

I'm eagerly waiting for Haswell to release so I can pick up a cheap second-hand p68 mobo with a 2500k :p

The fact that they upped the numbers in the naming scheme (x6xxk instead of x5xxk) suggests they're going to sell these alongside ivy bridge and that there is going to be another price hike for the overclockable quad core :\
 
At least 16GB OMGDDR800.

I think most Nehalem/Bloomfield folks should wait for Ivy-E/Haswell-E, whichever is the one that pops up in Nov/Dec. That'll be the time to jump.

Ivy-E won't be seeing very large upgrades if any at all. (Except the quad core being unlocked with 4820K.)

Also, none of the Ultrabook models (U/Y) will have any on-die iGPU mem.
 
Broadwell is going to be BGA, which means the chips will be soldered to the board.

Wow, this is pretty shocking really. I didn't realise this was happening so soon for Intel and motherboard manufacturers.

So what really happens is that Intel saves costs, with the expense passed on to the mobo manufacturers, who then pass it on to the consumer, who ends up taking the hit to the wallet.

Intel can then also be stricter about the products and prices they want on the market.

Seems crazy to wait for this if you're wanting a high-end CPU, because those costs in this part of the market are going to be crazy compared to what is available now.


I could be all wrong on the above, but this is what I have gathered from some quick researching.



Edit - Seems like LGA will still be around for Broadwell in some form, and it is more likely the lower-end CPUs going onto BGA.

According to a trusted source in the motherboard industry, select Broadwell chips will indeed come soldered onto desktop motherboards. Lower-end models might not be available in socketed configurations at all, it seems. Our source did, however, reaffirm Intel's position that socketed CPUs aren't being dropped completely. We were told socketed processors are on the roadmap until at least 2016.

Interestingly, our source said selling motherboards with soldered-on CPUs gives larger board makers an advantage over their smaller rivals. Intel's higher-volume customers will be able to pull processors from larger pools of chips, allowing them to cherry pick parts for higher-end products. Motherboard makers may be able to sell boards with pre-overclocked CPUs—or at least with chips that have proven clock speed headroom.

RMAs will be more complicated with soldered-on CPUs, and it sounds like the details are still being worked out on that front. Our source said mobo makers may have to handle replacing damaged CPUs themselves, even if they're eventually reimbursed by Intel. Again, that could favor larger producers whose service facilities have the BGA soldering equipment required for the task. Those manufacturers probably have better RMA service anyway, though.
 
If the overclocking is that much better then I might consider that upgrade, but I hate the fact I'd need a new mobo. I caught hell trying to overclock my 3570K to 4.7GHz.
 
Edit - Seems like LGA will still be around for Broadwell in some form, and it is more likely the lower-end CPUs going onto BGA.
Yeah, the enthusiast platform will continue to be LGA.

Like today we have Socket 1155 (Sandy/Ivy) and Socket 2011 (Sandy-E/Ivy-E), Broadwell-E will be LGA.
If the overclocking is that much better then I might consider that upgrade, but I hate the fact I'd need a new mobo. I caught hell trying to overclock my 3570K to 4.7GHz.
Delid it. Most should hit 4.8-5.0 once you get that terrible TIM out of there. It takes 30 mins and isn't nearly as scary as it sounds.
 
Yeah, the enthusiast platform will continue to be LGA.

Like today we have Socket 1155 (Sandy/Ivy) and Socket 2011 (Sandy-E/Ivy-E), Broadwell-E will be LGA.

Delid it. Most should hit 4.8-5.0 once you get that terrible TIM out of there. It takes 30 mins and isn't nearly as scary as it sounds.

I might try that. For gaming is there any advantage of the 2011 vs 1155?

Edit:
I have my 3570K at 4.7GHz; the process of getting there was tough. 4.7GHz on my CPU is quite stable, but if I could do 4.8GHz that'd be awesome.
 
That's one thing I will never do... delid. People say that like it's nothing. I have shaky hands and I sweat too much to do that crap.

Yeah I'll be waiting for Broadwell and Maxwell to hit before building my monstrosity.


Thinking about getting a GTX 780m and Haswell based laptop though.

http://www.notebookcheck.net/NVIDIA-GeForce-GTX-780M.88993.0.html
http://videocardz.com/39987/nvidia-geforce-gtx-770m-and-780m-spotted-monster-gaming-notebooks

Why do a laptop when you can just hook up a desktop and keep upgrading?
 
I'm seeing a lot of folks say the i7-920 (I have a 930) is good enough to stick around for this gen. I agree. But I'm still getting the mega itch and my motherboard's been acting up. Rather than trying to find an affordable mobo for this processor, I'm thinking I'm going to be a gigantic whore and do an entire rebuild.

Since I have 2GBx6 right now, I'll probably be investing in 8GBx2 or 8GBx4 depending on how big of a whore I want to be, and selling off or giving away the 2GB sticks.

I have a GTX 460, TRYING to hold onto that until the GeForce 700 series comes out. Planning on getting a 770 if it's around that $400 price point, or maybe a 670 when it gets cheaper after the 700s bump it down. We shall see...
 
I wonder if the CPU can use that eDRAM when the GPU is going unused or disabled; that would be very interesting. IBM's highest-end CPUs (and, with a much smaller local one, the Wii U CPU) have eDRAM caches, so that would have interesting performance implications.

Looks like only quads will get GT3e. Bummer, I was hoping for MBP sized 13 inch laptops with that GPU for decent integrated gaming.
 
Wait WTF, am I misunderstanding here or are we really getting 3.3GHz quads in ultrabooks??? My current 16-incher tops out at 3.2GHz with quad TB

Edit: Looking a bit closer I see they haven't detailed the CPUs ending with U, which I guess stands for Ultrabook :/
 
Haswell-E might have one at $300 or so.

They don't really need 6 cores though. AMD and Intel have both basically achieved 8 thread CPUs, despite both using 4 cores (4 core pairs in the case of AMD). People who want or need the extra overhead a beefy hex core brings to the table generally need it for enterprise or multimedia creation. A $550 processor isn't really that much more than a $330 processor when you are talking business tools.
 
So why don't review sites just convert it to FPS then? It is easier to understand that a 3570k is good for 54 fps compared to a 920 which is good for 43 fps. What info gets lost when the review site does the calculation compared to when the reader has to do it?

Because it's a useless step. FPS is useful in rendering: how many frames can my rig render in a second. With a game you want a smooth experience, and that's why that graph shows the frame time within which 99% of all frames were rendered. It's really FPS that's the strange metric.

PCPerspective "converted" some graphs over to FPS, but I personally find the graph showing all frame times over the test period much more helpful than a single number or a simple graph. Frame time graphs also show big stutters (one frame on screen for 100ms, i.e. 0.1 seconds) as the big, important things they are. IMO they just flat out make more sense and give a better view of how the game is displayed.
[Frame time graph: Crysis3_1920x1080_PER.png]
Notice SLI shows a much better average FPS, but has gigantic stutters that the first graph masked.
Wait WTF, am I misunderstanding here or are we really getting 3.3GHz quads in ultrabooks??? My current 16-incher tops out at 3.2GHz with quad TB

Edit: Looking a bit closer I see they haven't detailed the CPUs ending with U, which I guess stands for Ultrabook :/

Ultra-low wattage: it means the TDP (heat) is 17W, where 25-45W is normal for laptops.
 
Because it's a useless step. FPS is useful in rendering: how many frames can my rig render in a second. With a game you want a smooth experience, and that's why that graph shows the frame time within which 99% of all frames were rendered. It's really FPS that's the strange metric.

PCPerspective "converted" some graphs over to FPS, but I personally find the graph showing all frame times over the test period much more helpful than a single number or a simple graph. Frame time graphs also show big stutters (one frame on screen for 100ms, i.e. 0.1 seconds) as the big, important things they are. IMO they just flat out make more sense and give a better view of how the game is displayed.


Notice SLI shows a much better average FPS, but has gigantic stutters that the first graph masked.
What he is saying, and what others have said, is: show the exact same data, but instead of ms, show each frame as an FPS number to make it easily recognizable.

When you show the speed of a race car going through a speed trap, you are essentially measuring how quickly the car traveled through a given distance. This is then converted to MPH, despite MPH not being the actual measurement being taken.
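The ms-to-FPS conversion being described is just `fps = 1000 / frame_time_ms` applied per frame. A minimal sketch, using made-up frame times (the two ~100ms values are hypothetical stutter frames), showing how an average FPS figure can hide stutters that a 99th-percentile frame time exposes:

```python
# Hypothetical frame-time log in milliseconds, one entry per rendered frame.
frame_times_ms = [16.7, 16.9, 16.5, 100.0, 16.8, 17.1, 16.6, 16.7, 98.5, 16.9]

# Instantaneous FPS per frame: a 16.7 ms frame is ~60 FPS,
# while a 100 ms stutter frame is only 10 FPS.
instant_fps = [1000.0 / t for t in frame_times_ms]

# Average FPS over the whole run smooths the stutters away...
avg_fps = len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)

# ...while the 99th-percentile frame time makes them visible.
sorted_times = sorted(frame_times_ms)
p99_index = min(len(sorted_times) - 1, int(0.99 * len(sorted_times)))
p99_ms = sorted_times[p99_index]

print(f"average FPS: {avg_fps:.1f}")
print(f"99th-percentile frame time: {p99_ms:.1f} ms ({1000.0 / p99_ms:.1f} FPS)")
```

With these numbers the run averages around 30 FPS, yet the 99th-percentile frame took 100 ms, which is exactly the kind of stutter an FPS-only graph masks.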
 
Do you have the 900D with you? Can you please tell me if there is enough space to do a push/pull configuration with a 280mm radiator on the top part?
No he doesn't, but the answer to that question will depend on which 280mm rad you are talking about. The final answer will be yes most likely. There is 110mm of clearance, which means a radiator 60mm thick will fit, although probably snugly (25mm per fan).
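The clearance math above can be sketched as a quick check; the 110mm clearance and 25mm fan thickness are from the post, while the 60mm radiator thickness is a hypothetical thick-rad example:

```python
# All values in mm. Push/pull means one fan on each side of the radiator.
top_clearance = 110       # stated clearance in the case's top compartment
fan_thickness = 25        # standard 25 mm case fan
radiator_thickness = 60   # hypothetical thick 280 mm radiator

stack_height = radiator_thickness + 2 * fan_thickness
fits = stack_height <= top_clearance
print(f"stack is {stack_height} mm; fits in {top_clearance} mm: {fits}")
```

A 60mm radiator lands exactly at the 110mm limit, which is why the answer is "yes, but probably snugly"; a thinner radiator would leave some margin.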
 
Nvidia GeForce 650 supposedly.

Bear in mind only gameplay was shown and the 650 was TDP-locked at 35W; no framerates were given. One could be 50 and one could be 90 for all we know, and they would look comparable on 60Hz monitors, but one would have significantly more room to grow.

I wonder how much performance that eDRAM will add, and how much the GT3 alone will gain without it. I was hoping for 13" laptops with GT3e, but that doesn't seem like the plan now.
 
I need to buy a new laptop within the next couple of months. It looks like Haswell is designed for mobile, so I should probably wait?
 
A for effort though! How many people would actually take the time to do that as opposed to complain on a forum? Respect.

Thanks man.

Wait WTF, am I misunderstanding here or are we really getting 3.3GHz quads in ultrabooks??? My current 16-incher tops out at 3.2GHz with quad TB

Edit: Looking a bit closer I see they haven't detailed the CPUs ending with U, which I guess stands for Ultrabook :/

Considering quad-core (albeit non-hyperthreaded) Atom CPUs aren't too far off, I imagine the same for Ultrabook geared Core ix processors.
 
I have an old Phenom 2 1090T from 2010. Still trying to decide between waiting for this, Steamroller, or getting an i7-3770K.

What's the expected release date for Haswell?
 
Bear in mind only gameplay was shown and the 650 was TDP-locked at 35W; no framerates were given. One could be 50 and one could be 90 for all we know, and they would look comparable on 60Hz monitors, but one would have significantly more room to grow.

I wonder how much performance that eDRAM will add, and how much the GT3 alone will gain without it. I was hoping for 13" laptops with GT3e, but that doesn't seem like the plan now.

Even if the GT3e did outperform the 650 in raw performance, I imagine it would still probably fall behind when it comes to in-game performance; Intel's drivers are still very young compared to AMD/Nvidia drivers.

I have an old Phenom 2 1090T from 2010. Still trying to decide between waiting for this, Steamroller, or getting an i7-3770K.

What's the expected release date for Haswell?

~A month from now.
 
So... is it worth waiting for desktop Haswell? In the process of building a new PC and I can't really do much without getting the processor, haha.
 
Apparently the mobile version of Haswell will only feature the HD 4600, which is just a tiny improvement over the HD 4000. The real improvement, the HD 5200, is desktop-only.
Maybe that's just the initial launch?
I was toying with the idea of updating from my MacBook Air 2010 to a MacBook Air Q3 2013, but if it's below HD 5xxx then I will wait until 2014 before I upgrade.
PS! Read on Twitter today that Haswell will be the first Intel CPU to include x86 extensions for Hardware Transactional Memory (exciting stuff if you're into multicore programming).
 
So... is it worth waiting for desktop Haswell? In the process of building a new PC and I can't really do much without getting the processor, haha.
Probably not. The big generational changes in interface systems (SATA 3.0, PCI-E 3.0, USB 3.0) already happened with Ivy.
 