
Intel Ivy Bridge Reviews & Info — CPUs, Motherboards, Sandy Bridge Compatibility

Monarch

Banned
Except if the TweakTown review is any indication, not really. There were some benchmarks where the 2500K was a frame or two quicker. It was pretty much a dead heat. And it was only 2 or 3 watts more efficient under load and at idle than the 2500K. It's a COMPLETE disappointment if you were expecting anything over the 2500K in terms of performance/efficiency.

What are you talking about? In almost every benchmark in the TweakTown review, we see the 3570K ahead of the 2500K (even the 2700K in certain cases) for less energy consumed. Sure, it's maybe 5-10% more performance, but for a guy like me who is building a rig for the first time, I don't see the benefit of choosing a 2500K over a 3570K, apart from the fact that IB chips may end up being bad overclockers.
 

cilonen

Member
Yup, still happy I didn't wait and went for a 2600k. Hopefully the next 'tick' (or is it 'tock'?) wrings out some more performance.
 

The Boat

Member
I am so behind the times when it comes to computer hardware in general... I need to find some time to read up on things.
 

theRizzle

Member
No it isn't, http://en.wikipedia.org/wiki/Intel_Tick-Tock

IB is shrunk to 22nm, which means it's a Tick

Ya, I see that, but just because it's on Wikipedia doesn't necessarily make it accurate.

I could have sworn I saw a slide from an Intel presentation that had it listed as a Tock+, but I can't find it right now.

Either way, it seems that this is some kind of weird in-between chip that for some reason doesn't fit into their typical "Tick-Tock" structure.
 

surly

Banned
Ya, I see that, but just because it's on Wikipedia doesn't necessarily make it accurate.

I could have sworn I saw a slide from an Intel presentation that had it listed as a Tock+, but I can't find it right now.
It's tick+.

KBX0n.jpg
 

zoku88

Member
Ya, I see that, but just because it's on Wikipedia doesn't necessarily make it accurate.

I could have sworn I saw a slide from an Intel presentation that had it listed as a Tock+, but I can't find it right now.

Either way, it seems that this is some kind of weird in-between chip that for some reason doesn't fit into their typical "Tick-Tock" structure.

I think they called it a Tick+ or a Tock-ish Tick.

Because they have some architecture changes along with the shrink.
 

theRizzle

Member
It's tick+.

KBX0n.jpg


Ya, that's probably the one I saw, but I misread it. Thanks.

Regardless, from those early previews this seems like it might be a bit of a letdown. Gonna be building probably in June and was hoping for better things from IB.
 

1-D_FTW

Member
What are you talking about? In almost every benchmark in the TweakTown review, we see the 3570K ahead of the 2500K (even the 2700K in certain cases) for less energy consumed. Sure, it's maybe 5-10% more performance, but for a guy like me who is building a rig for the first time, I don't see the benefit of choosing a 2500K over a 3570K, apart from the fact that IB chips may end up being bad overclockers.

The only things I give a hoot about are gaming and power. I don't care about synthetic benchmarks; they're irrelevant. It slightly nudged past the 2500K in 3DMark 11, but was one frame slower in AVP. I'd call it a draw if ever there was one.

As for power, it was 88/303 vs 90/305. So it was a whopping two watts more efficient at idle and under load.

That's a pile of you know what for something that's on a new die process and had all the rhetoric about 3D transistors.

EDIT: I'm not telling you to choose the 2500k over the 3570, just that it's an absolutely worthless update from a performance POV. It's basically all about the IGP and notebooks, it seems.
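For what it's worth, those power deltas are just as tiny in relative terms. A quick back-of-the-envelope check (using the idle/load watt figures quoted above; which chip maps to which pair is my reading of them, not TweakTown's exact table):

```python
# Percent power savings implied by the figures quoted above
# (watts, idle/load: assumed 88/303 for the 3570K vs 90/305 for the 2500K).
ivy_idle, ivy_load = 88, 303
sandy_idle, sandy_load = 90, 305

def savings_pct(new, old):
    """Percent reduction going from old to new."""
    return (old - new) / old * 100

print(f"idle savings: {savings_pct(ivy_idle, sandy_idle):.1f}%")
print(f"load savings: {savings_pct(ivy_load, sandy_load):.1f}%")
```

So roughly 2% at idle and under 1% under load, which is why it reads as a wash.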
 
So it's basically a wash, with less overclocking headroom? How disappointing. Will SB chips still be available for a while or are they going to aggressively clear them out?
 

Karak

Member
Question about this. I have an i7 920 OC'ed to 4.1 GHz. I am super happy with it. I am having a hard time figuring out whether this kind of thing would be a real upgrade or not.

Any help?
 

Hazaro

relies on auto-aim
So it's basically a wash, with less overclocking headroom? How disappointing. Will SB chips still be available for a while or are they going to aggressively clear them out?
So far it seems that way. We'll see how mass retail chips fare; they're almost always better.
So if I want an integrated gfx laptop, is it time to choose Intel over AMD yet?
I think the HD 4000 and AMD's IGPs are about the same, with Intel still a tad slower. Both are fine for integrated graphics, and Intel is still the faster CPU, IIRC.
Question about this. I have an i7 920 OC'ed to 4.1 GHz. I am super happy with it. I am having a hard time figuring out whether this kind of thing would be a real upgrade or not.

Any help?
Not worth the upgrade.
 

Monarch

Banned
The only things I give a hoot about are gaming and power. I don't care about synthetic benchmarks; they're irrelevant. It slightly nudged past the 2500K in 3DMark 11, but was one frame slower in AVP. I'd call it a draw if ever there was one.

As for power, it was 88/303 vs 90/305. So it was a whopping two watts more efficient at idle and under load.

That's a pile of you know what for something that's on a new die process and had all the rhetoric about 3D transistors.

EDIT: I'm not telling you to choose the 2500k over the 3570, just that it's an absolutely worthless update from a performance POV. It's basically all about the IGP and notebooks, it seems.

I'm not sure it's relevant to judge Ivy Bridge CPUs' gaming performance based on a whopping one-frame difference. Anyway, I agree with you: if you're on SB, it's not worth upgrading. But let's see some reviews of retail chips first, before drawing any final conclusions.
 

1-D_FTW

Member
I'm not sure it's relevant to judge Ivy Bridge CPUs' gaming performance based on a whopping one-frame difference. Anyway, I agree with you: if you're on SB, it's not worth upgrading. But let's see some reviews of retail chips first, before drawing any final conclusions.

I'm not sure what we're really arguing. It's not like Ivy Bridge is a different architecture; it's not going to perform worse than the thing it's replacing. It's just a colossal disappointment for anyone who was holding off on a 2500K purchase for the past 6 months thinking it was going to give the 10-15 percent performance boost and 20 percent power reduction that were being claimed (due to the die shrink and 3D transistors).
 

Monarch

Banned
I'm not sure what we're really arguing. It's not like Ivy Bridge is a different architecture; it's not going to perform worse than the thing it's replacing. It's just a colossal disappointment for anyone who was holding off on a 2500K purchase for the past 6 months thinking it was going to give the 10-15 percent performance boost and 20 percent power reduction that were being claimed (due to the die shrink and 3D transistors).

I'm in this position (all of my parts are ready minus the CPU), and I really hope the retail IB chips won't be what it looks like now, i.e. not much of a step up from SB.
 

chaosblade

Unconfirmed Member
I have a Core i5 750, is it worth upgrading to a 3770?

I'd wait until next time, there should be some actual performance improvements with Haswell.

Surprised at all the questions. IB was never about a big performance improvement in the first place; it's a die shrink (plus tri-gate transistors), and it was known months ago that clock-for-clock performance would not change significantly. The poorer OCing and worse-than-expected power consumption are the real disappointments here, IMO.
 
So me buying a 2500k in November wasn't a mistake! I was really worried I was jumping the gun by not waiting for Ivy Bridge.

What's kind of funny is the same model 2500k is now $15 more expensive on Newegg than when I bought it 5 months ago...
 

LCGeek

formerly sane
I am also one of the many who has been waiting with their Q6600, can't wait to build my new rig this winter.

I have two Core 2 Duo machines I've been looking to upgrade. With the new ATI and Nvidia cards plus this, I can figure out the build I really want and bring my low-end machines up to par with my more modern hardware.
 

hwalker84

Member
I think later this year I'll still go with the Intel Core i7 3960X unless an Extreme Edition Ivy Bridge chip hits. I can get 50% off Intel components.
 

Jomjom

Banned
Damn, it looks like the 2500K won't be dropping much in price due to the IB chips. With improvements this minimal, there will still be heavy demand for the 2500K.
 
I don't know, we're getting to the point where you can play games at decent resolutions and settings at playable framerates on IGPs.

I guess you could blame that on consoles lowering the bar for PC game requirements and it will all be moot when next-gen starts and requirements see a spike, but who knows?

Yeah, I agree. Though while Intel's GPUs have made huge leaps with Sandy and Ivy Bridge, they're still in line with Intel's GPU slogan of "We suck, just enough." Heck, look at the news stories that came out last week saying that each Haswell GPU will include 64MB of VRAM. While this is great, and will obviously lead to the reported huge performance gains, 64MB is really not enough for serious gaming.

Personally I'm hoping that combined with the use of DDR4 system RAM running at the same speed as the processor, any downside of only having 64MB of VRAM will be minimized.
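To put that 64MB figure in perspective, here's a rough sketch of how much memory bare render targets alone would take at 1080p (my own back-of-the-envelope assumptions about buffer counts, nothing from Intel):

```python
# Rough framebuffer math at 1920x1080 with 4 bytes per pixel (32-bit color).
# Assumes front + back color buffers plus a 32-bit depth/stencil buffer;
# textures and other assets would need far more on top of this.
width, height, bytes_per_pixel = 1920, 1080, 4
one_buffer_mb = width * height * bytes_per_pixel / (1024 * 1024)
total_mb = one_buffer_mb * 3  # front + back + depth/stencil

print(f"one buffer: {one_buffer_mb:.1f} MB")
print(f"buffers total: {total_mb:.1f} MB of the 64 MB")
```

That's already over a third of the 64MB gone before a single texture is loaded, which is why the fast system RAM behind it would matter so much.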
 

teh_pwn

"Saturated fat causes heart disease as much as Brawndo is what plants crave."
·feist· said:
Ivy Bridge launch is going to be a hell of a thing. So many folks on a collision course with reality.

Not really for desktops. Performance is barely faster than Sandy, which came out over a year ago. The next new design from Intel is next year; Ivy is mostly about reduced power for mobile devices.

You can call it tick+ or whatever, it doesn't matter. It's basically sandy bridge.

Haswell next year will have a new CPU cache design and other improvements.
 

teh_pwn

"Saturated fat causes heart disease as much as Brawndo is what plants crave."
Worthwhile upgrade over the Q6600?

If things are slow now, sure. But you don't really need to wait for Ivy. Sandy gets 98% of the performance in theory, and in games it's going to be nearly impossible to tell the difference because most games are GPU-limited.


I'm not sure what we're really arguing. It's not like Ivy Bridge is a different architecture; it's not going to perform worse than the thing it's replacing. It's just a colossal disappointment for anyone who was holding off on a 2500K purchase for the past 6 months thinking it was going to give the 10-15 percent performance boost and 20 percent power reduction that were being claimed (due to the die shrink and 3D transistors).


And to these people I recommend just going ahead and getting Sandy. Why wait another 3 months for availability and for weird issues to be worked out, like Sandy had?

With Sandy you can tell which motherboards are reliable, order it now, and find OC settings online. Won't have to worry about firmware updates, etc.
 

Quadratic

Member
Sticking with my i7 920. I'll spend the money I was saving for IB on a GTX 680 4GB model or a similar non-reference design. Should tide me over for 2 more years, I hope.
 

clav

Member
Still a LGA775 user here.

Things will get interesting once ARM processors move into the regular computing space.

No, you won't be able to play games, but for daily tasks, that may radically change the market.
 
Sticking with my i7 920. I'll spend the money I was saving for IB on a GTX 680 4GB model or a similar non-reference design. Should tide me over for 2 more years, I hope.

Exactly what I was thinking. They both cost roughly the same, but I'm sure to see a massive boost in performance upgrading from my 5770, compared to upgrading my CPU to Ivy Bridge.
 

teh_pwn

"Saturated fat causes heart disease as much as Brawndo is what plants crave."
Looks like Nvidia was right. We're hitting a wall with die shrinks as far as cost reduction and efficiency go.

Eh, that's a different problem.

Intel has had this 2 year cycle:
1. New design, huge boost in performance
2. Die shrink, very small increase in performance

For several years now. Anyone that pays attention to the market (not attempting to be condescending, no shame in not being obsessed with this stuff outside of work) knows that Ivy was going to be Sandy + Die shrink + integrated GPU upgrade.

The die shrink's room for more transistors all went into the integrated GPU.

One of the "improvements" with Ivy is official DDR3-1600 support... but if you OC the memory on Sandy you can already run 2133 MHz, I think, so it's not really better.

And Sandy Bridge was a massive improvement. I'll be thrilled if Haswell is as big of an upgrade. But really the future of computing is going to have to incorporate dozens of CPU cores + GPU design, or something that can do both.
 

_woLf

Member
Reading about these super high temps the 3770K is getting... 97°C with an H100? Are you kidding me?

edit: It seems to me like the 3570K can OC way easier than the 3770K, with much less of a problem with overheating. How weird...
 

kvn

Member
I'm going to upgrade. Still riding a Core 2 Duo E6400, so I guess it's finally time.
 

1-D_FTW

Member
According to whom? Tweaktown didn't get that but maybe they were an anomaly.

When is the official embargo up? Don't want to make a decision just on one set of benches.

The i7 did have an improvement close to that under load. It's the i5 that was the total bust.
 
I want to see benchmarks first before deciding. If you guys were me and had the choice, would you add another GTX 680 for SLI, or instead upgrade from an i7 975 Extreme to Ivy Bridge?
 