
Nvidia Kepler - Geforce GTX680 Thread - Now with reviews

Kepler is beating the Radeon HD 7970 with a narrow memory bus. When Nvidia lets the Kepler with 512-bit memory bus out of the cage, it's going to be a slaughter.

I still can't max out GTX 470 SLI in many of my games using a Core i7-950 (Bloomfield) OCed to 4 GHz. I would need a hugely OCed Ivy Bridge just to keep my GTX 470 SLI cards busy, so I can't imagine what kind of CPU you'd need for Kepler SLI to keep those beasts fed.
 

Nah. An OC'd 2500K @ 4.5 GHz is more than enough for 470 SLI, 580 SLI, or even 680 SLI. Hell, a 5 GHz 2500K is probably the max needed for 680 SLI. The CPU won't bottleneck; games just aren't pushing a 470 SLI setup hard enough.
 

dr_rus

Member
How did Nvidia manage to triple the core count in such a small die? They ditched the hot clocks! Yes, probably half of the 3D geek world already knows this answer (hello dr_rus), but what else did Nvidia change?
Hi artist. Yeah, I knew all of this; that's why I said "one of the reasons" to BoobPhysics. The 680 sitting in my case right now says hi to you too. A nice card, but as I've said earlier, it should cost $400 at most, not $500.
 

artist

Banned
Who knows. I think the chip being launched as the 680 was originally slated to be the 660.
6 SMX
1152 SPs
96 TMUs
24 ROPs
6 Polymorph engines
192-bit 6Gbps GDDR5
May
$299


Hi artist. Yeah, I knew all of this; that's why I said "one of the reasons" to BoobPhysics. The 680 sitting in my case right now says hi to you too. A nice card, but as I've said earlier, it should cost $400 at most, not $500.
Wiggle room. Pics?
 

dsk1210

Member
They have added adaptive vertical sync. I asked Andrew from Nvidia about that ages ago, so I'm glad to see it has been added.

With v-sync on it's a rock-solid 60; the minute the frame rate drops below that, v-sync switches off, letting the frame rate stay higher at the cost of some screen tearing. A great option to have for demanding games.

Bayonetta on the 360 used this smart v-sync and kept the game nice and fluid for the most part.

Along with the new AA method, I am keenly anticipating the 685/690.
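For anyone wondering how the feature works, the adaptive v-sync behaviour described above boils down to a simple per-frame decision. This is just an illustrative sketch of that idea, not Nvidia's actual driver logic; the function name and the frame-time threshold scheme are assumptions:

```python
def adaptive_vsync(frame_time_ms, refresh_hz=60):
    """Pick a present mode per frame, per the adaptive v-sync idea above.

    If the GPU finished the frame within one refresh interval, sync the
    swap to the display (no tearing). If it took longer, present
    immediately: you get a bit of tearing, but the frame rate stays
    higher instead of hard-dropping (e.g. to 30 on a 60 Hz screen, as
    classic double-buffered v-sync would).
    """
    refresh_interval_ms = 1000.0 / refresh_hz  # ~16.67 ms at 60 Hz
    return "sync" if frame_time_ms <= refresh_interval_ms else "tear"
```

So a 14 ms frame at 60 Hz presents synced, while a 20 ms frame tears instead of stalling for the next refresh.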



Dave
 

Oh sweet. Do you think they'll enable this for the GTX 5xx series through new drivers?
 

Sethos

Banned
Man, this is why I love Nvidia: even their drivers are something you can look forward to.

Still torn on whether to upgrade. My 580 is a great card but honestly, it's struggling at 2560x1600 with just a bit of AA. However, I know I'll regret it when the GK110 is out in a few months, plus the 2GB is really putting me off.

Although I'm going SLI, so it doesn't have to be top of the line either... Decisions.
 

dsk1210

Member
How do I use a modified nvidia.inf file? Never had to do that before, but it won't let me install unless I use the modded file.

Ignore this; just copy the modified .inf file into the extracted folder with the display properties folder.

Dave
 

dsk1210

Member
Just tested Alan Wake at 1080p maxed, and adaptive v-sync works fantastically: no judder and just a slight amount of tearing when things get hectic. Very pleased. :)


Dave
 

1-D_FTW

Member
Wow. Didn't know the bit about adaptive v-sync. I'll have to try it.

Minimum frame rates in general (except for Battlefield 3) are better on the HD 7970, which I consider the better architecture: more potential and much more elegant. Add to that the long-idle TDP advantage, which also matters if you leave your rig on at night...

At the risk of sounding like a jerk, if 10 - 15 watts is a huge deal to you, why don't you just put your computer to sleep at night and save real energy?
 

Hazaro

relies on auto-aim
Looks pretty decent. Some good coolers coming in under $550 is OK.
We'll see where the OC potential lies and how min frames hold up. Pretty nice for the 660Ti.
Too bad AMD isn't being competitive; this should have been $400.
ATI need to drop their prices, the 7800 series is hugely overpriced
It is, but compared to what competition?
 

artist

Banned
OP is updated.

[image: geforce-gtx-680-style2-1000x580.png]
 
Looks pretty decent. Some good coolers coming in under $550 is OK.
We'll see where the OC potential lies and how min frames hold up. Pretty nice for the 660Ti.
Too bad AMD isn't being competitive; this should have been $400.

It is, but compared to what competition?


Well, it's not that huge a leap to the 680.


Also, eh. If you're referring to the 680 as the 660Ti, what is Nvidia actually going to be selling as the 660?
 
Wow. Didn't know the bit about adaptive v-sync. I'll have to try it.



At the risk of sounding like a jerk, if 10 - 15 watts is a huge deal to you, why don't you just put your computer to sleep at night and save real energy?

It's a huge deal for the life of the GPU.
 

1-D_FTW

Member
Well, it's not that huge a leap to the 680.


Also, eh. If you're referring to the 680 as the 660Ti, what is Nvidia actually going to be selling as the 660?

Hopefully something that's been intentionally gimped with clock speeds and can be easily over-ridden to get a lot of the performance back.

It's a huge deal for the life of the GPU.

My GTX 460 barely runs much over ambient temps when it's sitting at long idle. I don't think 15 - 20 watts (with a fan running on it) is going to affect its life one bit. It's nice and all that AMD is putting in a zero-power state for when the monitor is off and the computer is running, but if anyone is truly interested in saving electricity, the sleep button on the keyboard is a whole lot better. It takes 3 seconds to come out of sleep, and according to my Kill-a-watt, my entire system uses 1 - 2 watts while sleeping.
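To put the power argument in numbers, here is a back-of-the-envelope sketch. The 15 W delta, the 2 W sleep draw, and the 8 hours/night come from the posts above; the 100 W whole-system idle figure is my own assumption for illustration:

```python
# Rough energy math for the long-idle vs. sleep argument.
idle_delta_w = 15      # claimed long-idle GPU advantage, watts (from the thread)
sleep_draw_w = 2       # whole system asleep, per the Kill-a-watt reading above
full_idle_w = 100      # hypothetical whole-system idle draw (assumption)
hours_per_night = 8
nights_per_year = 365

def kwh_per_year(watts, hours_per_night, nights_per_year):
    # watts * hours gives watt-hours; divide by 1000 to convert to kWh
    return watts * hours_per_night * nights_per_year / 1000.0

gpu_saving = kwh_per_year(idle_delta_w, hours_per_night, nights_per_year)
sleep_saving = kwh_per_year(full_idle_w - sleep_draw_w, hours_per_night, nights_per_year)
print(round(gpu_saving, 1), round(sleep_saving, 1))  # 43.8 286.2
```

Under those assumptions the GPU's long-idle advantage is worth roughly 44 kWh a year, while sleeping the whole machine saves several times that, which is the point being made here.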
 

dark10x

Digital Foundry pixel pusher
Performance looks nice, but I'm kind of glad it isn't a massive leap. I purchased a GTX580 on the cheap from a GAF member earlier this year and was kind of hoping the 680 wouldn't completely obliterate it yet. :p OK, that's kind of silly, but hey.

I really need to upgrade my CPU, though, as it seems the good ol' i7 930 just can't keep up with the 580 in some cases. I'm finding some games are CPU limited with drops under 60 fps occurring regardless of graphical settings.
 

Geezer

Broken water pistol loaded with piss
6 SMX
1152 SPs
96 TMUs
24 ROPs
6 Polymorph engines
192-bit 6Gbps GDDR5
May
$299

Out of interest, is this based on GK104 with parts disabled, or on another, smaller die?
SemiAccurate said that there will be 7- and 8-SMX GK104 parts, which correspond to the 1344 SP (maybe a future GTX 670) and 1536 SP (GTX 680) variants.
 

confused

Banned
So, can this card finally run Crysis 1 maxed at 60+ fps at 1080p, or can we finally admit the game was just badly optimised/terribly supported?
 

artist

Banned
Out of interest, is this based on GK104 with parts disabled, or on another, smaller die?
SemiAccurate said that there will be 7- and 8-SMX GK104 parts, which correspond to the 1344 SP (maybe a future GTX 670) and 1536 SP (GTX 680) variants.
Yes. Unconfirmed, so don't put any stock into it.
 

rabhw

Member
So if GK104 = GTX680, GK110 = GTX690? Does that mean the 690 will be a dual-GPU part like the 590? Haven't been keeping up with this stuff.

I want to hop off the SLI / CrossfireX train, so I'll be selling my dual 6950's to buy one of these. Where do people reliably and un-sketchily sell video cards?
 
Nvidia GPUs are better than AMD GPUs?

What a huge surprise.

I don't have a personal need for a 600 series yet, but I sure as hell am happy with progress.
 

ss_lemonade

Member
Those benchmarks make me happy and sad at the same time lol. Happy because of the good performance increases, and sad because I don't see my card in any of those benches lol.

I have a 6970 and a 1920x1080 monitor. You guys think it's worth making the jump? The only really heavy games I have are Battlefield, Crysis, and The Witcher 2 (Uber).
 
It's a huge deal for the life of the GPU.

Surely my GTX 470 would be very very dead already if it mattered ;)

Are there any rumors about the 670 release date? I'd love to upgrade, but I always prefer the slightly cut-down versions of the top card, as they have a better value/performance ratio.
 

scogoth

Member
Performance looks nice, but I'm kind of glad it isn't a massive leap. I purchased a GTX580 on the cheap from a GAF member earlier this year and was kind of hoping the 680 wouldn't completely obliterate it yet. :p OK, that's kind of silly, but hey.

I really need to upgrade my CPU, though, as it seems the good ol' i7 930 just can't keep up with the 580 in some cases. I'm finding some games are CPU limited with drops under 60 fps occurring regardless of graphical settings.

You shouldn't be CPU limited in any game with an i7 930. Are you running stock clocks?
 
I have an AMD HD 5850.

I know $500 is a lot of money, but man, I think it would be a great upgrade.
But it depends on what you are actually playing. Given the prices and my gaming diet, I'm beginning to think I could easily hold off for some months until new releases arrive or prices become more reasonable. Plus the 2GB is a turn-off; a game like Skyrim gets close to 2GB at 1080p with the official texture pack and 4x AA.

Really impressive, but which one overclocks better at stock voltage? That could be a decisive factor.
So, can this card finally run Crysis 1 maxed at 60+ fps at 1080p, or can we finally admit the game was just badly optimised/terribly supported?
Or you could put on the other pair of glasses and view it this way: there were also a lot of games that didn't look close to Crysis yet ran worse.
 
Building a new rig as soon as Ivy Bridge is out, so I can wait at least until then. :)

I game at 1920x1080 on my TV with the 360 controller most of the time, and mostly at high settings rather than everything maxed.
 

scogoth

Member

So why do you need a $500 GPU and a new CPU to game at 60fps at 1080p with a controller that has noticeable input lag?
 
Yes, and I'll add that depending on what CPU The Dutch Slayer is upgrading from and what games he typically plays, he could still see slight fps increases from the better CPU architecture. Also, Dutch, try to get a GPU with 3GB; even if you game at 1080p, I gave that Skyrim example where 2GB gets maxed out depending on your settings.

Maybe some months from now there will be a 680 3GB edition at the same price.
 