
Nvidia Kepler - Geforce GTX680 Thread - Now with reviews

Hawk269

Member
Some minor new info from the HardOCP forums:



[Image: 2012-03-15_135626-1.jpg]


Source: http://hardforum.com/showpost.php?p=1038505114&postcount=730

Sadface...I don't want to wait 8+ days for the official reveal...that means it's probably two weeks until we can order two. :)

The way that reads, he is saying that with one card, using the Heaven benchmark, he got 29fps at full settings in surround???

If so, that is really impressive considering it was running in surround. I am curious how this puppy will handle The Witcher 2 with ubersampling at 1080p on a single card.

Also wondering what two of these in SLI are capable of. Of course, that is if the drivers are up to snuff and they have some good SLI profiles, which is what concerns me the most.
 

elty

Member
According to that HKEPC review, they expect AMD to cut the 7970 price by $200.

$200 HKD = ~$25 USD. So the 680 will probably be $549, just like the rumor suggested, while AMD drops the 7970's price by about 5%.
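
Rough math behind that, for anyone checking (the ~7.8 HKD-per-USD rate and the $549 price point are just the assumptions from the rumors above):

Code:
# Back-of-envelope: how big is a HK$200 cut in USD and as a share of $549?
HKD_PER_USD = 7.8      # HKD is pegged near 7.75-7.85 per USD
cut_hkd = 200.0        # rumored HK$200 price cut
price_usd = 549.0      # rumored launch price

cut_usd = cut_hkd / HKD_PER_USD
print(f"Cut in USD: ~${cut_usd:.0f}")                        # ~$26
print(f"Cut as a share of $549: {cut_usd / price_usd:.1%}")  # ~4.7%, i.e. roughly 5%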
 

sk3tch

Member
According to that HKEPC review, they expect AMD to cut the 7970 price by $200.

$200 HKD = ~$25 USD. So the 680 will probably be $549, just like the rumor suggested, while AMD drops the 7970's price by about 5%.

That would make sense. If you stroll around NewEgg today you will see 7970s at lower prices than ever before. It seems like everyone is prepping...
 

sk3tch

Member
I'm not sure what everyone was expecting.

GTX 480 -> GTX 580 = ~20% faster
GTX 580 -> GTX 680 = ~20% faster

Prices have always been around $500-$600 for the top-end card.
 
I'm still on the power-gobbling, heat-spewing GTX 480 right now, but its performance is still keeping me happy (outside of tessellation, but really, fuck tessellation right now). I think I'll wait until the GTX 7xx to upgrade.
 

tokkun

Member
I'm not sure what everyone was expecting.

GTX 480 -> GTX 580 = ~20% faster
GTX 580 -> GTX 680 = ~20% faster

Prices have always been around $500-$600 for the top-end card.

How about the generation before that, where the GTX460 came out for $230 and was better than a GTX285?
 

artist

Banned
I'm not sure what everyone was expecting.

GTX 480 -> GTX 580 = ~20% faster
GTX 580 -> GTX 680 = ~20% faster

Prices have always been around $500-$600 for the top-end card.
Where was this reasoning when the 7970 came out and was faster than the 580 by 20-30%?
 
I'm not sure what everyone was expecting.

GTX 480 -> GTX 580 = ~20% faster
GTX 580 -> GTX 680 = ~20% faster

Prices have always been around $500-$600 for the top-end card.



All of these comparisons are flawed. Yours is because the transition from the GTX 4xx to the GTX 5xx cards didn't involve a new manufacturing process (both are 40nm parts, just like AMD's Radeon 5xxx and 6xxx series), while the 6xx cards do move to a new process and therefore have far more potential for performance gains.
The other comparisons are flawed because the GTX 680 is not Nvidia's high-end chip. The high-end chip will arrive later as roughly a GTX 690 or - if it arrives even later - might even be a GTX 7xx card. In other words, the situation is a bit like comparing the HD 58xx cards with the HD 68xx cards (older high-end chip versus newer performance chip).



Edit:
When can we expect a successor to the 560 ti to be released?


The GTX 680 and 670 are, pretty much, the successors to the GTX 560 (both are performance chips). It'll apparently take quite some time until prices come down, though :/
 
talk to me when ati have the equivalent to 3d vision.

3d gaming is the best gaming.

Have you ever heard of HD3D? There are fewer games with native support compared to 3d vision, but with 3rd party products such as TriDef3D most newer games can be rendered in 3d on both brands.
 

sk3tch

Member
All of these comparisons are flawed. Yours is because the transition from the GTX 4xx to the GTX 5xx cards didn't involve a new manufacturing process (both are 40nm parts, just like AMD's Radeon 5xxx and 6xxx series), while the 6xx cards do move to a new process and therefore have far more potential for performance gains.
The other comparisons are flawed because the GTX 680 is not Nvidia's high-end chip. The high-end chip will arrive later as roughly a GTX 690 or - if it arrives even later - might even be a GTX 7xx card. In other words, the situation is a bit like comparing the HD 58xx cards with the HD 68xx cards (older high-end chip versus newer performance chip).

Everyone knows the GTX 680 is the GK104 and apparently a GK110 is coming...but that doesn't change how the card is being positioned or where it lands in performance per dollar - similar to past generations. A new manufacturing process does not always guarantee maximum performance; there are other benefits too, like better performance per watt, i.e. less power consumption.

My point was simplistic, to be sure - but there's nothing "new" going on here. At $550 these GTX 680s will be impossible to find for the first few weeks.
 
Everyone knows the GTX 680 is the GK104 and apparently a GK110 is coming...but that doesn't change how the card is being positioned or where it lands in performance per dollar - similar to past generations. A new manufacturing process does not always guarantee maximum performance; there are other benefits too, like better performance per watt, i.e. less power consumption.

My point was simplistic, to be sure - but there's nothing "new" going on here. At $550 these GTX 680s will be impossible to find for the first few weeks.


That doesn't mean it isn't still bullshit.
 

dr_rus

Member
I wonder if it supports two monitors.
Every GeForce since the GeForce2 MX supports two monitors. GK104 supports up to four.

TXAA looks intriguing, wonder if that's what makes it possible to go head to head with a 384-bit card - normally, I would expect the card with more memory bandwidth to spank the 256-bit card with decent res and 8xAA.
It's not. All the comparative testing is done with the same settings which means that TXAA isn't used.

I really hope nVidia reveals what the GK110 is capable of at launch, although business-wise I guess it doesn't make much sense.
It doesn't make much sense because GK110 is at least 6 months away. It is already in production but all the chips that they'll get in the following months will be used in supercomputers.

I'm not sure what everyone was expecting.

GTX 480 -> GTX 580 = ~20% faster
GTX 580 -> GTX 680 = ~20% faster

Prices have always been around $500-$600 for the top-end card.
480->580 isn't a generational change of production process; 8800->280 is, as are 285->480 and 480->680.
Anyway, coming three months later than the competition while being mostly at the same performance level means that the only thing you can adjust to attract buyers is the price. NV seems to not get that. Let's see how fast these prices fall with lower-than-expected sales.
 

Q8D3vil

Member
Have you ever heard of HD3D? There are fewer games with native support compared to 3d vision, but with 3rd party products such as TriDef3D most newer games can be rendered in 3d on both brands.

Yeah, but it's weak compared to 3D Vision, especially now that 3D Vision fans are fixing the 3D themselves (Mass Effect 3 and The Darkness II in 3D).
 

sk3tch

Member
480->580 isn't a generational change of production process; 8800->280 is, as are 285->480 and 480->680.
Anyway, coming three months later than the competition while being mostly at the same performance level means that the only thing you can adjust to attract buyers is the price. NV seems to not get that. Let's see how fast these prices fall with lower-than-expected sales.

I addressed that above. Generational change does not equal insane performance - there are other factors.

And I will bet you the GTX 680 does very, very well. I own a 7970 and I am going to buy two 680s. I have allegiance to Nvidia when it comes to multi-GPU. They do it right.
 

1-D_FTW

Member
I'm not sure what everyone was expecting.

GTX 480 -> GTX 580 = ~20% faster
GTX 580 -> GTX 680 = ~20% faster

Prices have always been around $500-$600 for the top-end card.

Because it's a new architecture. When have new architectures been celebrated for providing a 20 percent boost? Never.

EDIT: Plus, the very fact Nvidia made a "flagship" with a pathetic 2GB of VRAM tells you all you need to know. This was a last-minute rebadging of a lower-tier card.
 

bee

Member
Because it's a new architecture. When have new architectures been celebrated for providing a 20 percent boost? Never.

EDIT: Plus, the very fact Nvidia made a "flagship" with a pathetic 2GB of VRAM tells you all you need to know. This was a last-minute rebadging of a lower-tier card.

2GB of VRAM is pathetic? It's enough for me to play all but a few games at 5760x1080, plus they're making a 4GB version too.
 

Sethos

Banned
Is there any game in existence that requires more than 2GB for single monitor use?

Plenty. I've hit the limit a few times on my 30" monitor, especially when it comes to high-resolution texture mods; 2GB just isn't cutting it. Lately I've messed around with some of the recent mods for S.T.A.L.K.E.R. CoP and they stutter to hell and back because they are constantly hitting the VRAM ceiling. Some games are sitting just below my 1.5GB ceiling, and why the hell would I shell out that kind of money and not expect a big VRAM boost as well?
 

Smokey

Member
2GB of VRAM is pathetic? It's enough for me to play all but a few games at 5760x1080, plus they're making a 4GB version too.

It kind of is when I have 580s that have 3GB on them and their competitor's cards have 3GB standard.

Excited about this reveal. Good to know that surround will run on one card too...was to be expected but good to see it confirmed. The GK110 part is supposed to hit sometime in late 2012?
 

Kreunt

Banned
I think I'll be sticking with my 2x unlocked 6950s for a while yet; this new gen of GPUs doesn't seem like enough of an improvement for me to spend money on.
 

Durante

Member
I'm sure you can use more than 2 gb at 1600p with 8x SGSSAA forced in some games that feature nice sized textures.
Of course, but SGSSAA is basically equivalent to rendering at higher resolution in terms of VRAM use. I was just wondering if there were any recent PC games that actually feature high-res textures that would tax 2GB cards (without modding).
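
As a rough sketch of why forced SGSSAA eats VRAM the way a higher resolution does (the buffer count and byte sizes here are illustrative assumptions, not measurements):

Code:
# Render-target footprint: SGSSAA stores one shaded sample per pixel per AA sample,
# so the big buffers scale with (resolution x sample count), just like a higher resolution would.
def render_target_mb(width, height, samples, bytes_per_sample=4, buffers=3):
    # buffers ~ color + depth/stencil + one post-process target (illustrative assumption)
    return width * height * samples * bytes_per_sample * buffers / (1024 ** 2)

print(render_target_mb(2560, 1600, 1))   # ~47 MB with no AA
print(render_target_mb(2560, 1600, 8))   # ~375 MB with 8x SGSSAA, before counting any textures

Texture memory itself doesn't grow with the sample count, which is why it still takes genuinely high-res assets (or mods) on top of that to push past 2GB.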
 

1-D_FTW

Member
Of course, but SGSSAA is basically equivalent to rendering at higher resolution in terms of VRAM use. I was just wondering if there were any recent PC games that actually feature high-res textures that would tax 2GB cards (without modding).

Is typical usage really the point, though? If you're just doing typical usage, why would anyone need to spend 500 dollars on a flagship? Flagship is for the extreme and in that sense, AMD's 3GB solution seems a lot more practical for that segment.
 

Stallion Free

Cock Encumbered
Of course, but SGSSAA is basically equivalent to rendering at higher resolution in terms of VRAM use. I was just wondering if there were any recent PC games that actually feature high-res textures that would tax 2GB cards (without modding).

BF3 and Witcher 2 with textures on ultra are the only ones I can think of that might get close.
 

Durante

Member
Is typical usage really the point, though? If you're just doing typical usage, why would anyone need to spend 500 dollars on a flagship? Flagship is for the extreme and in that sense, AMD's 3GB solution seems a lot more practical for that segment.
Yeah, it's pretty obvious (as if it wasn't obvious enough from the "GK104" codename) that this is actually not a high-end product. In a way we can blame AMD for it being sold as one :p
 

Lyonaz

Member
When will this card be released? I'm gonna build a gaming PC in May for Diablo 3 and Max Payne 3 and want it to be pretty high end so it can last me several years.
 

mhayze

Member
Every GeForce since the GeForce2 MX supports two monitors. GK104 supports up to four.
I should have clarified, although I thought the rest of the sentence explained it - dual monitor "surround gaming". AFAIK, this is not possible.

It's not. All the comparative testing is done with the same settings which means that TXAA isn't used.
Which makes the benchmarks even more of a mystery. How does the GPU push pixels that fast given the memory bandwidth constraints? Now I'm looking forward to the hardware analysis after the reveal to explain this bit of black magic.

It doesn't make much sense because GK110 is at least 6 months away. It is already in production but all the chips that they'll get in the following months will be used in supercomputers.
That is the rumor, but I thought it was more 3 months away than 6 months away (June timeframe?)

480->580 isn't a generational change of production process; 8800->280 is, as are 285->480 and 480->680.
Anyway, coming three months later than the competition while being mostly at the same performance level means that the only thing you can adjust to attract buyers is the price. NV seems to not get that. Let's see how fast these prices fall with lower-than-expected sales.

I think you underestimate the pull of the Green Goblin :) Nvidia has almost always managed to make more money than ATI/AMD, even when they were slower / late to the party, and they're not slower in this case.
 

dr_rus

Member
I addressed that above. Generational change does not equal insane performance - there are other factors.
Usually there aren't. It's just that this time they've made such a good mid-range chip (or maybe it's AMD who's done a bad job with Tahiti) that they can use it against the fastest competing GPU.

When will this card be released? I'm gonna build a gaming PC in May for Diablo 3 and Max Payne 3 and want it to be pretty high end so it can last me several years.
March 22nd.

Which makes the benchmarks even more of a mystery. How does the GPU push the pixels that fast with the memory bandwidth constraints. Now looking forward to the hardware analysis after the reveal to explain this bit of black magic.
ROP numbers are the same (32) for both GK104 and Tahiti. Plus Kepler might have better caching schemes, better MSAA color compression etc. It's not impossible and this already was the case with G80 (384 bit) and R600 (512 bit).

That is the rumor, but I thought it was more 3 months away than 6 months away (June timeframe?)
Current rumour puts it around September.
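
On the bandwidth question above, a quick back-of-envelope using the commonly reported memory specs (treat the exact clocks as rumors until the reviews land):

Code:
# Peak memory bandwidth = (bus width in bytes) x (effective GDDR5 data rate)
def bandwidth_gb_s(bus_bits, data_rate_gbps):
    return bus_bits / 8 * data_rate_gbps

print(bandwidth_gb_s(256, 6.0))   # GTX 680 (rumored): 256-bit @ 6.0 Gbps -> ~192 GB/s
print(bandwidth_gb_s(384, 5.5))   # HD 7970:           384-bit @ 5.5 Gbps -> ~264 GB/s

That's roughly 27% less raw bandwidth, so any parity really would have to come from the ROP, caching and compression side.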
 

pestul

Member
I think you underestimate the pull of the Green Goblin :) Nvidia has almost always managed to make more money than ATI/AMD, even when they were slower / late to the party, and they're not slower in this case.
Well, they're going to have to show new cards at a price point that most people can afford before that happens this generation. Neither company makes very much money off the high end (or anything over $350, for that matter).

I was hoping they'd put this new card at about $450-$499...but it doesn't seem like that is going to happen.
 
Of course, but SGSSAA is basically equivalent to rendering at higher resolution in terms of VRAM use. I was just wondering if there were any recent PC games that actually feature high-res textures that would tax 2GB cards (without modding).

BF3 and Witcher 2 with textures on ultra are the only ones I can think of that might get close.

Nope. Only Rage @ 1080p with forced 8k textures, forced caching, and 8x MSAA + transparency Supersampling goes over 2 GB. BF3 has dynamic texture caching so it can avoid breaking VRAM limits.
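
For a sense of scale on those forced 8k textures, a purely illustrative calculation (Rage's actual megatexture streaming is more involved than this, so treat the formats and overheads as assumptions):

Code:
# Rough size of a single 8192x8192 texture; a full mip chain adds about one third on top.
def texture_mb(size, bytes_per_texel, mips=True):
    base = size * size * bytes_per_texel
    return base * (4 / 3 if mips else 1) / (1024 ** 2)

print(texture_mb(8192, 4))   # ~341 MB uncompressed RGBA8
print(texture_mb(8192, 1))   # ~85 MB with 4:1 block compression (e.g. DXT5)

A handful of those resident at once, plus the 8x MSAA render targets, makes it easy to see how 2 GB gets exceeded.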
 