
Nvidia Kepler - Geforce GTX680 Thread - Now with reviews

sk3tch

Member

Lookin' good! How was the install?


I just got the signature edition, here's what the extra $50 gets you:

http://i830.photobucket.com/albums/zz227/dbswisha/DSC_0708.jpg
http://i830.photobucket.com/albums/zz227/dbswisha/DSC_0709.jpg <--mousepad

You also get a poster. Other than that stuff the card is the exact same. Shirt is pretty cool though.

Congrats Smokey! 690 is def. the way to go. I was looking at that card on EVGA.com the other day and trying to figure out what that $50 gets you. Not bad. And yeah, I'm the nerd that wears those shirts - at least when I'm kickin' it with my 22 month old son...haha.

What kind of FPS are you getting with BF3 MP? Just eyeball estimates from FRAPS while my 680 2GB quad-SLI is running...but at 5760x1080 with everything maxed (no deferred AA) I get ballpark 70-80 FPS...although it can get as low as 50s at times. Hoping my 4GB upgrade will let me max it all out.
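Quick back-of-the-envelope on why surround is so much heavier to drive than a single panel (just pixel counts, nothing rig-specific):

```python
# Pixels per frame for the resolutions people in this thread are running.
def megapixels(width, height):
    """Pixels per frame, in millions."""
    return width * height / 1e6

for name, (w, h) in {
    "1080p single": (1920, 1080),
    "30\" panel (2560x1600)": (2560, 1600),
    "Surround (5760x1080)": (5760, 1080),
}.items():
    print(f"{name}: {megapixels(w, h):.2f} MP")
```

Surround pushes about 3x the pixels of a single 1080p screen and roughly 1.5x a 1600p panel, which lines up with the FPS gap.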


Interested in GAF's opinion on the GTX680s from Zotac - thinking of picking one up from Amazon.

I haven't heard anything bad about Zotac and you know how great Amazon is - so I say buy. If you can, use a CC (such as AMEX) that provides add'l warranty protection for your purchase. Did you check the reviews of the same model on NewEgg to see how people are faring?
 

Lord Panda

The Sea is Always Right
Thanks fellas :) Newegg boffins are very positive on the card.

Yeah I was thinking of picking up two of the Zotac units for some sweet SLI action until the price just spiked to $549. Oh well that's what I get for dithering ... ;)
 

mxgt

Banned
Just got my 2nd 670

Playing Crysis with high res mods and max settings at 1440p with 60fps solid

/tear. finally.
 

sk3tch

Member
Picking up an EVGA Z77 FTW mobo to try out native PCIe 3.0 with quad-SLI GTX 680. For some reason NVIDIA still hasn't fixed X79 + GTX 680 PCIe 3.0 support. More info here for those curious.

This means I'll be digging out my trusty ol' i5 2500k that OCs to 4.7 GHz and "downgrading" to only 16GB of RAM. Heh. If it turns out as well as advertised in that thread, I'll ditch my 3960x/Rampage IV Extreme/extra RAM and probably save like $1k AND have better performance in games (not that the 3960x was ever a good investment for that). Stupid that this isn't fixed yet. Not going to waste my 680 4GB upgrade with PCIe 2.0 holding me back.

I tried the hack outlined in that thread - my setup is really unstable with it...so that's why I'm going down the Z77 + quad-SLI path (only three current options: EVGA Z77 FTW, ASUS P8Z77V Premium, and ASUS P8Z77 WS). Z77 is the only "officially" supported PCIe 3.0 platform for NVIDIA currently. EVGA_JacobF stated they're (NVIDIA) still certifying X79...whatever. It has been months!

Nuts because I've got PCIe 3.0 lovin' running great with a 7970 CFX rig!
 

Smokey

Member
Congrats Smokey! 690 is def. the way to go. I was looking at that card on EVGA.com the other day and trying to figure out what that $50 gets you. Not bad. And yeah, I'm the nerd that wears those shirts - at least when I'm kickin' it with my 22 month old son...haha.

What kind of FPS are you getting with BF3 MP? Just eyeball estimates from FRAPS while my 680 2GB quad-SLI is running...but at 5760x1080 with everything maxed (no deferred AA) I get ballpark 70-80 FPS...although it can get as low as 50s at times. Hoping my 4GB upgrade will let me max it all out.

I got a Dell U3011 with the 690 to replace my Asus 3D monitor. At 2560x1600 on the U3011 I get 60fps easy in BF3. Feels like the 690 is yawning at me. I cap it at 60 because well...there's no point in going higher when the monitor is 60hz.
On the Asus I easily get 115-120fps everything max. Again I cap it at 120 because there's no point in going higher for gameplay. Card is stock too...no OC. I feel there really is no need to for gaming.

I wish it had more memory, but it hasn't been a problem for me at 2560x1600. The card is amazing and is hands down an upgrade over my previous 580 SLI setup. Quieter, cooler, and the craftsmanship of the card is second to none. IMO it's probably the best card Nvidia has ever made. The engineering behind it is impressive.

Picking up an EVGA Z77 FTW mobo to try out native PCIe 3.0 with quad-SLI GTX 680. For some reason NVIDIA still hasn't fixed X79 + GTX 680 PCIe 3.0 support. More info here for those curious.

This means I'll be digging out my trusty ol' i5 2500k that OCs to 4.7 GHz and "downgrading" to only 16GB of RAM. Heh. If it turns out as well as advertised in that thread, I'll ditch my 3960x/Rampage IV Extreme/extra RAM and probably save like $1k AND have better performance in games (not that the 3960x was ever a good investment for that). Stupid that this isn't fixed yet. Not going to waste my 680 4GB upgrade with PCIe 2.0 holding me back.

I tried the hack outlined in that thread - my setup is really unstable with it...so that's why I'm going down the Z77 + quad-SLI path (only three current options: EVGA Z77 FTW, ASUS P8Z77V Premium, and ASUS P8Z77 WS). Z77 is the only "officially" supported PCIe 3.0 platform for NVIDIA currently. EVGA_JacobF stated they're (NVIDIA) still certifying X79...whatever. It has been months!

Nuts because I've got PCIe 3.0 lovin' running great with a 7970 CFX rig!

It also has a small PLX chip on the card itself which allows you to take advantage of PCIE 3.0. Makes this feature useable on X79 boards I believe. Maybe you should look into two 690s ;)

When the GTX 680 was first launched, some assumed that its performance would be somewhat curtailed on anything but a PCI-E 3.0 slot. NVIDIA had other ideas since their post-release drivers all dialed its bandwidth back to PCI-E 2.0 when used on X79-based systems. The reasons for this were quite simple: while the interconnects are built into the Sandy Bridge-E chips, Intel doesn't officially support PCI-E 3.0 through their architecture. As such, some performance issues arose in rare cases when running two cards or more on some X79 systems.

This new GTX 690 uses an internal PCI-E 3.0 bridge chip which allows it to avoid the aforementioned problems. But with a pair of GK104 cores beating at its heart, bottlenecks could presumably occur with anything less than a full bandwidth PCI-E 3.0 x16 connection. This could cause issues for users of non-native PCI-E 3.0 boards (like P67, Z68 and even X58) that want a significant boost to their graphics but don't want to upgrade to Ivy Bridge or Sandy Bridge-E.

In order to test how the GTX 690 reacts to changes in the PCI-E interface, we used our ASUS X79WS board which can switch its primary PCI-E slots between Gen2 and Gen3 through a simple BIOS option. All testing was done at 2560 x 1600 in order to eliminate any CPU bottlenecks.

http://www.hardwarecanucks.com/foru...s/53901-nvidia-geforce-gtx-690-review-25.html
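For reference on what's actually at stake bandwidth-wise, here's the rough theoretical math per slot generation (real-world throughput is lower, but the ratio is what matters):

```python
# Rough per-direction bandwidth of a PCIe x16 slot, by generation.
# PCIe 1.x/2.0 use 8b/10b line encoding; PCIe 3.0 uses 128b/130b.
def pcie_x16_gbps(gen):
    per_lane_gt = {1: 2.5, 2: 5.0, 3: 8.0}[gen]  # GT/s per lane
    encoding = 8 / 10 if gen < 3 else 128 / 130   # usable data fraction
    return per_lane_gt * encoding * 16 / 8        # GB/s across 16 lanes

print(f"PCIe 2.0 x16: {pcie_x16_gbps(2):.1f} GB/s")   # 8.0 GB/s
print(f"PCIe 3.0 x16: {pcie_x16_gbps(3):.2f} GB/s")  # ~15.75 GB/s
```

So a card stuck at Gen2 has roughly half the slot bandwidth, which is why the dual-GPU 690 on a single x16 link is the worst case for it.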
 

sk3tch

Member
It also has a small PLX chip on the card itself which allows you to take advantage of PCIE 3.0. Makes this feature useable on X79 boards I believe. Maybe you should look into two 690s ;)



http://www.hardwarecanucks.com/foru...s/53901-nvidia-geforce-gtx-690-review-25.html

I would but 2x690 is essentially the same as what I have now if I move to IB - and it still has the bandwidth problems if you go quad (i.e. 2x690) - they are speaking of just one card in that quote. Just want to make sure I have a platform that can allow me to fully take advantage of quad-SLI 680 4GB. Plus, I don't want to dump heat into my case. That's why I love the reference design 680s! Although - I do wish the 690 were out earlier, I would have probably gone with two of those and Z77 from the get-go. X79 is a bust with NVIDIA right now.

How is gaming on a 30" panel? Is it a hindrance in MP? I've heard anything over 24" will degrade your reaction due to the amount of eye/head movement to cover your screen.
 

Smokey

Member
How is gaming on a 30" panel? Is it a hindrance in MP? I've heard anything over 24" will degrade your reaction due to the amount of eye/head movement to cover your screen.

It is pretty immersive. The difference in screen size is massive:

DSC_0847.jpg


What you said I've also heard about bigger TVs. In the end I was able to adjust to it and play just fine. That being said, the 120hz panel is better for MP. Whenever I go back to that monitor I do better, but I still hold my own just fine on my bigger panel.

The 1600p resolution plus the screen size makes it preferable to me. Games look amazing. There are pros and cons to both, really. That's why I have both side by side, for whatever mood I'm in ;)
 

sk3tch

Member
Wow...that's tight. Makes me wanna switch. One project at a time, though. :) I'm sure you recognized what I recognize - nothing beats a great panel. This 120hz shit is awesome for MP but other than that - it's so-so. The colors are so washed out/etc. I had an Alienware M18x R2 with the 18" glossy Samsung panel - thing was ridiculous. So nice. Made me realize how bad the colors and detail are on these BenQ 120hz panels (XL2420/etc.). It was dual 7970M and the drivers were horrid so I sent it back and I'm waiting for the 680M to drop in a week or two.

Just bought an i5 3570k...figure I will go "all out" on Ivy Bridge. $220...lol. What a deal.

If anyone wants a sweet X79 setup I'll sell it to ya for cheap...haha.
 

scogoth

Member
Wow...that's tight. Makes me wanna switch. One project at a time, though. :) I'm sure you recognized what I recognize - nothing beats a great panel. This 120hz shit is awesome for MP but other than that - it's so-so. The colors are so washed out/etc. I had an Alienware M18x R2 with the 18" glossy Samsung panel - thing was ridiculous. So nice. Made me realize how bad the colors and detail are on these BenQ 120hz panels (XL2420/etc.). It was dual 7970M and the drivers were horrid so I sent it back and I'm waiting for the 680M to drop in a week or two.

Just bought an i5 3570k...figure I will go "all out" on Ivy Bridge. $220...lol. What a deal.

If anyone wants a sweet X79 setup I'll sell it to ya for cheap...haha.

*cough* *cough* >_>

So how bout that cross border shipping?
 

Smokey

Member
Wow...that's tight. Makes me wanna switch. One project at a time, though. :) I'm sure you recognized what I recognize - nothing beats a great panel. This 120hz shit is awesome for MP but other than that - it's so-so. The colors are so washed out/etc. I had an Alienware M18x R2 with the 18" glossy Samsung panel - thing was ridiculous. So nice. Made me realize how bad the colors and detail are on these BenQ 120hz panels (XL2420/etc.). It was dual 7970M and the drivers were horrid so I sent it back and I'm waiting for the 680M to drop in a week or two.

Just bought an i5 3570k...figure I will go "all out" on Ivy Bridge. $220...lol. What a deal.

If anyone wants a sweet X79 setup I'll sell it to ya for cheap...haha.


Yeah the colors are amazing. Even general browsing they pop out. Since I have the two monitors directly next to each other it's fairly easy to compare them and it's no contest. The resolution really does wonders man...games look crazy good.

I'll also say that since I've been using this monitor for a few weeks I am used to 60hz again, and I'm now back to feasting in BF3 :p. BTW...I notice you got a 3570k...what's up with the 3770k? Sold out everywhere and has been for a really long time with no replenishment. It's weird.
 

Gaogaogao

Member
Any clues as to if the GTX 660 will be stronger or weaker than the GTX 580? Cause I'm looking at a good deal on a GTX 580 right now.
 

MrBig

Member
Any clues as to if the GTX 660 will be stronger or weaker than the GTX 580? Cause I'm looking at a good deal on a GTX 580 right now.

If the earlier spec leak was accurate it will be around the same performance with 1.5gb vram, just a tad better. And use much less power.
 

Binabik15

Member
Damn, AMD is pissing me off right now, seriously. It keeps asking me every other day to update Catalyst 12.3, and no matter how I install it or sweep my old version, the 12.4 download ends up at 12.3 and then tells me to upgrade again. Urgh. I wanted to wait at least until the Radeon 8000 series to upgrade my 5850, but right now Arkham City, BF3 and SR3 brutalize it pretty hard. SR3 had a performance problem with AMD, didn't it?

So, looking at stores I could get reference 670s at ~350€, or a Gigabyte GeForce GTX 670 OC for 380€, but 380€ is also what reference 7970s start at. The fancier cooler/OC models are as expensive as or more expensive than reference 7970s. Now I've gathered that 7970s are roaring monsters and that, without OC, 670s match them in quite a few games, but I want tips and impressions from more knowledgeable and experienced owners.

I have an i5 750 with only a small OC right now, but with an aftermarket cooler that could probably change. W7 64-bit and a good 550 watt PSU that shows its age by not having a dedicated 8-pin plug(?)/cord/whatever, though.

So, enlighten me, please. I only play at 1080p, so performance from both should be more than plenty, but buying a card only slightly better than the 5850 now and then again soon doesn't sound so hot, especially since I don't know who I could sell my old card(s) to. That's why I want(ed) to wait with upgrading, but new stuff is shiny and I'm open-minded right now. Though I'll have to wait till August to spend a lot of time tinkering around, so that might be close to the AMD refreshes, argh.
 

sk3tch

Member
Yeah the colors are amazing. Even general browsing they pop out. Since I have the two monitors directly next to each other it's fairly easy to compare them and it's no contest. The resolution really does wonders man...games look crazy good.

I'll also say that since I've been using this monitor for a few weeks I am used to 60hz again, and I'm now back to feasting in BF3 :p. BTW...I notice you got a 3570k...what's up with the 3770k? Sold out everywhere and has been for a really long time with no replenishment. It's weird.

Yeah I actually bought both. Managed to snag a 3770k when Amazon replenished on Monday. Probably going to keep the 3770k...although I'm not a fan of the "waste" it is for gaming. :)

I'll do my 60hz gaming on my M18x R2. Maybe I'll inch back to great colors and fidelity some day. I just need any edge I can get in BF3 MP - I suck! :)
 
So is the 670 the best value for money, or is the 680 a lot better?

I'll be building a new PC in a few months, and wondering which one I should be aiming for.
 

sk3tch

Member
670. Performance difference is minimal with an OC. An OCed 670 can outperform a 680.

Or buy one of my EVGA GTX 680 2GB for $450 in the b/s/t thread. :)

680 is still the boss. But the value is definitely on the 670's side. No OC can overcome the add'l CUDA cores the 680 has.
 

MrBig

Member
Or buy one of my EVGA GTX 680 2GB for $450 in the b/s/t thread. :)

680 is still the boss. But the value is definitely on the 670s side. No OC can overcome the add'l Cuda cores the 680 has.

Yeah, that's a great deal you have, but shipping to Aus won't be, I'd guess.

And yes, I just mean ref. Once you start to OC a 680 there's no way a 670 can keep up.
 

Jtrizzy

Member
Damn, this whole monitor thing is killing me here, and this thread isn't helping lol. I know it's OT, but this is the best place to ask for help deciding what my next project should be:

-I do all of my pc gaming at 1080p60 on a VT25 plasma.

-I need a pc monitor, but am concerned about the washed out colors in the Asus, especially coming from my plasma.

-I've never seen resolutions higher than 1080p, and I feel like I'd rather shoot for 1080p60 in 3d over higher resolutions. I don't really care for the 720p 3d that my plasma is limited to.

-I'm interested in a tri monitor set up, but again am worried by the colors of the Asus.
 

Smokey

Member
In case anyone missed it, beta drivers:

http://www.guru3d.com/news/nvidia-geforce-30448-drivers-download/

- Key Fixes
Fixes an intermittent vsync stuttering issue with GeForce GTX 600-series GPUs.
Fixes an issue where some manufacturer’s factory overclocked cards default to and run at lower clocks.
Fixes a performance issue in Total War: Shogun 2 with the latest game patch.

Fucking finally. Glad I'll be able to play with vsync on again
 

Rufus

Member
Down to $430 now.

7970 Launch: $549
Post 680 Launch: $479
7970GE Launch: $430
Well, never mind then!

Still mystified why they would bother for not even a handful of frames... Unless it's purely to get people talking about AMD again. Which it probably is. Never mind me, I need to go to bed.
 

mkenyon

Banned
Isn't the GE just an OC, not a hardware mod? The 670 is $30 less than it, $400.
Yep. They're just packaging up the performance everyone was already getting by moving the sliders partially to the right. People have been saying the 680 is king; this was their way of getting people to talk about the 7970 again on equal footing. It *always* was on equal footing, but only via the OC headroom. I'm not even talking about the "turn the fan to 100% and see max MHz on air" OC room people mean when they talk about Kepler. There is a ridiculous amount of headroom on the 7970s.
 
Isn't the GE just an OC, not a hardware mod? The 670 is $30 less than it, $400.

7970 looks good in those GE runs, but what has me super concerned is the power usage and fan noise. I've read the few reviews out and although it trades blows with and beats the 680 on various tests, both of the aforementioned go WAY up. FWIW.
 
7970 looks good in those GE runs, but what has me super concerned is the power usage and fan noise. I've read the few reviews out and although it trades blows with and beats the 680 on various tests, both of the aforementioned go WAY up. FWIW.

Way up is 20-40 watts and 5db?

Of course, it's just an overclocked 7970 and the same would happen if you overclocked the 680 or any other card on the market. At this level of hardware power usage isn't really a big deal and most that spend this much will have some form of aftermarket cooling or buy a model with a decent cooler on it.

With Sea Islands ready for production, AMD might as well get as much as they can from the 7 series, knowing all NVIDIA can really do at this point (apparently) is pile multiple GPUs onto a single card to take the performance crown.
 

MrBig

Member
Way up is 20-40 watts and 5db?

Of course, it's just an overclocked 7970 and the same would happen if you overclocked the 680 or any other card on the market. At this level of hardware power usage isn't really a big deal and most that spend this much will have some form of aftermarket cooling or buy a model with a decent cooler on it.

With Sea Islands ready for production, AMD might as well get as much as they can from the 7 series, knowing all NVIDIA can really do at this point (apparently) is pile multiple GPUs onto a single card to take the performance crown.

Reportedly they have a GeForce version of the GK110 waiting for production if AMD puts out anything new, so that they can stay competitive.
 