
Nvidia Kepler - Geforce GTX680 Thread - Now with reviews

McHuj

Member
Do we know anything about the systems that were used in the supposed benchmark?

It's been 2-3 months since the 79xx reviews; drivers could have changed, as could the motherboard, memory, and CPU. A more Nvidia-friendly setup could have been chosen.

I'm assuming all the benchmarks were rerun on an identical system in each review (other than the GPU and its drivers, of course).
 

-SD-

Banned
http://blog.renderstream.com/2012/03/pre-order-gtx-680-and-the-latest-intel-xeon-based-systems/

 

1-D_FTW

Member
That's scandalous. I never liked Tom's because they felt like a poor man's Anandtech.




I have 3 spare GPUs in case I need to buy a new one: a spare 8800 GT, the iGPU on my 2500K, and a spare GTX 570 1.28 GB sitting in a box. I plan to swap one of those in if I sell my 2.5 GB GTX 570 and upgrade to a 680.

I quit trusting them during the AMD heyday, back when AMD CPUs were cheaper, more powerful, and more efficient, and Tom's Hardware was clearly on Intel's payroll, would flat-out deny reality, and got really aggressive with people who questioned their motives. From that point forward, I trusted nothing from them.
 

artist

Banned
Tom's is now backpedalling, in full damage control. Review numbers will be different?

Images of Leaked Tom's Hardware charts have been deleted! These are UNAPPROVED images taken from Tom's Hardware's system. Don't post any more images of the charts from the leaked info, until the official release of the GTX 680. The NDA has not passed for the GTX 680 and this information being released will not be tolerated at Tom's Hardware.

Thanks!

Tom's Hardware Moderators!
 

-SD-

Banned

I don't know if I trust that. It could just be some marketing guy fumbling, right? Even then, it's just for their implementation.
Are those full CUDA cores, or did Nvidia opt for simpler ones similar to AMD?
1536 is the correct amount, according to this:


More here: http://www.neogaf.com/forum/showthread.php?t=459499&page=19

This is where I got the link: http://forums.cgsociety.org/showthread.php?f=59&t=1042011
 

pestul

Member
Sounds like a silly setup (RenderStream) given the GTX 680's gimped compute ability (going by Tom's leaked slides, lol)... 7970s would be much more sensible in a workstation. :S
 

-SD-

Banned
Sounds like a silly setup (RenderStream) given the GTX 680's gimped compute ability (going by Tom's leaked slides, lol)... 7970s would be much more sensible in a workstation. :S
I don't believe that NVIDIA have gimped it one bit. On the contrary - just think about how big their GPGPU business is.
 

Durante

Member
Sounds like a silly setup (RenderStream) given the GTX 680's gimped compute ability (going by Tom's leaked slides, lol)... 7970s would be much more sensible in a workstation. :S
RenderStream stuff isn't for general GPGPU, it's for... rendering. And in that use case gimped double precision doesn't matter much.

I don't believe that NVIDIA have gimped it one bit. On the contrary - just think about how big their GPGPU business is.
Their GPGPU business is profitable because they are selling the same chips for 10x the price. And that's why they "gimp" the GPGPU capabilities of their non-Tesla cards.
 

dr_rus

Member
When I see those words, I think 199.99 or 219.99. Not 299.99. If it's the low 200s, I'll give you an e-kiss. Seems way too hard to believe. But I want to. I wanna believe, GAF. Believe.

EDIT: Grr. Misread and thought you said the 670 was going to be priced at that point. If it has 570-level performance, it better have really low power numbers. Low power and price is the only combo that would tempt me to make that type of upgrade.
The 670 will be faster than the 7950, which is a little bit faster than the 580.
GK106 should be on par with the 570, but with the 680 being $500 I don't think the faster GK106 card will cost $200; $300 is more likely.

Are we getting anything but the 680 these next couple of weeks? I'm looking for a 250-300 dollar GPU, and if we're not getting a 660 or something for that price, I'll just go with a 7850...
Next couple of weeks -- no. Next couple of months -- yes.

AMD needs to get a decent performance driver out for the 79xx series (5-10% increase) to compete in the 1080p segment.
AMD isn't the only one who can make decent performance drivers. Any driver-based performance improvements will be more or less equal for both AMD and NV.

So any idea on when the new Nvidia GPUs are actually coming out? I've been dying to upgrade my 5770 for about a year--ATI's had a shitty couple of years for compatibility--and I'd either get a 570 or 680, depending on the price of each at the 680's launch.
Tomorrow for 680.

I'm well aware the 1500 number is correct, just not whether they are as beefy as the old-style cores.
The cores are the same; you can't get any less "beefy" than a MADD core. They are organized differently now, though.
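To put numbers on that (a rough back-of-envelope, assuming the commonly reported clocks and the usual 2 FLOPs per MADD):

```python
# Peak single-precision throughput: each CUDA core retires one
# multiply-add (2 FLOPs) per clock, so peak = cores * clock * 2.
def peak_gflops(cores, clock_mhz):
    return cores * clock_mhz * 2 / 1000.0

print(peak_gflops(512, 1544))   # GTX 580: 512 Fermi cores at the 1544 MHz hot clock -> ~1581 GFLOPS
print(peak_gflops(1536, 1006))  # GTX 680: 1536 Kepler cores at a 1006 MHz base clock -> ~3090 GFLOPS
```

Same per-core math; there are just three times as many cores running at roughly two-thirds the clock.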

DP is gimped, no doubt, but I don't think compute performance is optimized yet; at least it shouldn't be that low.
GK104 is a gaming-oriented GPU which was meant for the mid-range market (a GF114 replacement). As it is, it's not compute-oriented at all, thus it being worse at compute than Tahiti and on par with Pitcairn isn't that surprising.
GK110 is the one which will be compute-oriented. And yeah, you laughed, but it is in production already.
 
I'm well aware the 1500 number is correct, just not whether they are as beefy as the old-style cores.

They're not. They're a simpler design. If it was 1536 CUDA cores, Fermi-style, they'd need at least a 384-bit or 512-bit bus to have the memory bandwidth to keep up with that amount of processing power. Not to mention the card would be HUGE.
 

Erasus

Member
So, when will that architecture hit $100? Not that card, but something based on it. Prices for high-end stuff are so dumb; it's not THAT much better than consoles.
 

mkenyon

Banned
So, when will that architecture hit $100? Not that card, but something based on it. Prices for high-end stuff are so dumb; it's not THAT much better than consoles.
120Hz, 120fps, 1080p. That's an order of magnitude better than console output.
 
So, when will that architecture hit $100? Not that card, but something based on it. Prices for high-end stuff are so dumb; it's not THAT much better than consoles.

You realize a GTX 680 is roughly 10-15x more powerful than a 360/PS3 GPU, if not more? It's far better than the hardware in consoles. In fact, the GTX 680/7970 are most likely better GPUs than what will be in Xbox 3/PS4.
 

artist

Banned
GK104 is a gaming-oriented GPU which was meant for the mid-range market (a GF114 replacement). As it is, it's not compute-oriented at all, thus it being worse at compute than Tahiti and on par with Pitcairn isn't that surprising.
No one is contesting that, or why. Last time, however, Nvidia's x80 model served well as a Tesla product. That isn't the case this time around.

What I was suggesting was that despite the DP being gimped, the compute performance in other workloads will not be that dismal.

And yeah, you laughed, but it is in production already.
lol

There is a difference between sampling and production.

So, when will that architecture hit $100? Not that card, but something based on it. Prices for high-end stuff are so dumb; it's not THAT much better than consoles.
Nvidia is filling the low-end ($100) with Fermi shrinks, so not any time soon.
 

Hazaro

relies on auto-aim
So, when will that architecture hit $100? Not that card, but something based on it. Prices for high-end stuff are so dumb; it's not THAT much better than consoles.
Buy a 6870 if you want good value.
Should net 60FPS at 1080p with good settings on a broad range of titles.
 

Erasus

Member
120Hz, 120fps, 1080p. That's an order of magnitude better than console output.

I do 1080p on lots of titles on a 4830 at 30-50fps. Sure it is better, but it's not really "this was worth $500" better. I plugged my PS3 into the same monitor (24" 1080p) and KZ3/Uncharted 2 looked amazing too. But I'm going off topic.
 

Erasus

Member
Plus, those PS3 games you played are being rendered at lower than 1080p, despite being on a 1080p display.

Oh, I know they are 720p and then upscaled. I'm not a retard. Sure, real 1080p looks way clearer, but that's really it. I have not done a comparison of the same game, but say KZ3 at 720p vs Singularity/L4D2/Crysis 2 etc. at 1080p is not an OMGWTFBBQHAX difference. And spending $200 more on a comp, I would want to get that.
 
Oh, I know they are 720p and then upscaled. I'm not a retard. Sure, real 1080p looks way clearer, but that's really it. I have not done a comparison of the same game, but say KZ3 at 720p vs Singularity/L4D2/Crysis 2 etc. at 1080p is not an OMGWTFBBQHAX difference. And spending $200 more on a comp, I would want to get that.

Come on now, Crysis 2 in 1080p is way better. 1080p vs 720p is like night and day.
 

DieH@rd

Banned
The folks at B3D are taking Tom's apart; at this rate we'll need to discard the review.

7970/7950 becoming 10% slower, exact same benchmark and settings.

Nvidia is really enjoying these lies ahead of the initial reviews. Charlie will rip them a new one. :D
 

dr_rus

Member
They're not. They're a simpler design. If it was 1536 CUDA cores, Fermi-style, they'd need at least a 384-bit or 512-bit bus to have the memory bandwidth to keep up with that amount of processing power. Not to mention the card would be HUGE.
They're the same. Compute density is rising much faster than bandwidth; it has been like this since the beginning of computing. As for the card being huge, there are many changes in the things surrounding the compute cores that make this number possible in approximately the same transistor budget as GF110's. Losing hot clocks for the cores is one such thing.
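A quick ratio makes the point (a sketch using the launch specs floating around; treat the 680 numbers as rumored until the reviews land):

```python
# Arithmetic intensity a card can sustain from DRAM: FLOPs per byte.
def flops_per_byte(peak_gflops, bandwidth_gbs):
    return peak_gflops / bandwidth_gbs

# GTX 580: ~1581 GFLOPS, 384-bit bus at 4 Gbps -> ~192.4 GB/s
print(flops_per_byte(1581, 192.4))  # ~8.2 FLOPs per byte
# GTX 680: ~3090 GFLOPS, 256-bit bus at 6 Gbps -> ~192.3 GB/s
print(flops_per_byte(3090, 192.3))  # ~16.1 FLOPs per byte
```

Bandwidth is essentially flat between the two generations while peak compute roughly doubles, which is exactly the trend described above.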

What I was suggesting was that despite the DP being gimped, the compute performance in other workloads will not be that dismal.
DP isn't the only thing that gets gimped in gaming grade GPUs which aren't made for Teslas. I think it's pretty telling that they're not launching a new CUDA version with GK104.

There is a difference between sampling and production.
Really? 8) So should I say again that it's already in production? As in, not sampling, but already in production.
The question now is when (and IF) they will decide to make a GeForce on it. That will depend on TSMC's capacity and AMD's ability to push something faster than the 680 out there. Right now they seem to feel very comfortable selling a 294 mm² GPU on cards retailing for $500/€500 and don't see much reason to make a GK110-based GeForce at all.
 

mkenyon

Banned
Oh, I know they are 720p and then upscaled. I'm not a retard. Sure, real 1080p looks way clearer, but that's really it. I have not done a comparison of the same game, but say KZ3 at 720p vs Singularity/L4D2/Crysis 2 etc. at 1080p is not an OMGWTFBBQHAX difference. And spending $200 more on a comp, I would want to get that.
Gotcha. Well, like I said, it's not just about 1080p; the 120Hz makes a huuuuuuuuge difference. One of those weird issues that straddle the line between objective and subjective. Not that big of a deal to you, so no worries.
 

artist

Banned
DP isn't the only thing that gets gimped in gaming grade GPUs which aren't made for Teslas. I think it's pretty telling that they're not launching a new CUDA version with GK104.
Isn't that obvious?

Really? 8) So should I say again that it's already in production? As in, not sampling, but already in production.
The question now is when (and IF) they will decide to make a GeForce on it. That will depend on TSMC's capacity and AMD's ability to push something faster than the 680 out there. Right now they seem to feel very comfortable selling a 294 mm² GPU on cards retailing for $500/€500 and don't see much reason to make a GK110-based GeForce at all.
OK, keep going; I guess if you keep repeating it, it might actually become true!
 

SapientWolf

Trucker Sexologist
Oh I know they are 720p and then upscaled. Im not a retard. Sure real 1080p looks way clearer, but thats really it. I have not done a comparision of the same gam, but say KZ3 in 720p vs Singularity/L4D2/Crysis 2 etc in 1080p is not a OMGWTFBBQHAX difference. And spending 200 more on a comp I would want to get that.
It is if you're using a computer monitor.
 

artist

Banned
Apparently not, if you're thinking that GK104's compute performance should be higher than Pitcairn's just because it has more cores.
Quite a few jumps you are making there. You'll see its compute performance is quite a bit higher than Pitcairn's, irrespective of the CC-count argument.
 

dionysus

Yaldog
I do 1080p on lots of titles on a 4830 at 30-50fps. Sure it is better, but it's not really "this was worth $500" better. I plugged my PS3 into the same monitor (24" 1080p) and KZ3/Uncharted 2 looked amazing too. But I'm going off topic.

What are you talking about? The 4830 was a low-end card released in fucking 2008. It is an order of magnitude less powerful than the card we are talking about here. Comparing the performance you would get out of a 680, a high-end card in 2012, with a 4830, a low-end card in 2008, just so you can say it is not that big an upgrade over a PS3 or 360, is laughable.

Please tell me I misunderstood your post.
 
They're the same. Compute density is rising much faster than bandwidth; it has been like this since the beginning of computing. As for the card being huge, there are many changes in the things surrounding the compute cores that make this number possible in approximately the same transistor budget as GF110's. Losing hot clocks for the cores is one such thing.

You're not even sure the SPs aren't hot-clocked. Last rumor I checked, the SPs were hot-clocked.

And eventually a card with massive processing power becomes bandwidth-starved. Even Fermi was somewhat bandwidth-starved, and I think GK104 will be even worse. Luckily they're using RAM that can hit ridiculous clocks, but I can see GK110 performing much better with a 384-bit or 512-bit bus.
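For scale, bandwidth goes up linearly with bus width at a fixed data rate, so here is what the hypothetical wider buses would buy (a sketch holding the rumored 6 Gbps memory speed constant):

```python
# bandwidth (GB/s) = (bus width in bits / 8) * data rate in Gbps
def bandwidth_gbs(bus_bits, data_rate_gbps):
    return bus_bits / 8 * data_rate_gbps

print(bandwidth_gbs(256, 6))  # 192 GB/s -- GK104's rumored 256-bit bus
print(bandwidth_gbs(384, 6))  # 288 GB/s -- a hypothetical 384-bit GK110
print(bandwidth_gbs(512, 6))  # 384 GB/s -- a hypothetical 512-bit bus
```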
 

DonasaurusRex

Online Ho Champ
I do 1080p on lots of titles on a 4830 at 30-50fps. Sure it is better, but its not really "This was worth 500" better. I plugged in my PS3 in the same monitor (24 1080p) and KZ3/Uncharted 2 looked amazing too. But Im going off topic.

...wait how much did you pay for a 4830???
 
BP, hot clock is gone.


W1zzard (the admin of TechPowerUp and author of GPU-Z) removed the "shader clock" field and replaced it with "boost" in the latest revision.

Gotcha.

I still think this turbo boost thing is kind of weird for a GPU, and it might make overclocking a bit more annoying if it isn't implemented well... I'm guessing Nvidia has made it smart enough to turn off during heavy overclocking, without spiking when you're trying to push to your highest stable max. Then again, it is a great idea for power efficiency.

I really think a decent Kepler will OC even higher than a 7970/7950. Their stock voltage appears to be very low, and I think they should grab a solid 300-400 MHz on the core. And I hope the RAM chips hit 7-8 GHz with ease.


I do 1080p on lots of titles on a 4830 at 30-50fps. Sure it is better, but its not really "This was worth 500" better. I plugged in my PS3 in the same monitor (24 1080p) and KZ3/Uncharted 2 looked amazing too. But Im going off topic.

You're trolling with this. PC games have worlds-better image quality, assets, texture resolution, and antialiasing, and MUCH higher frame rates. If you have decent eyes and a decent monitor/HDTV, the difference is astounding.
 

Hawk269

Member
Nice benches for 1080p; might have to pick this up now. Would love to see a Witcher 2 @ 1080p ultra/4xAA benchmark.

Joy at being able to play Metro 2033 @ 60fps with max settings and 4xAA @ 1080p.

Bingo. But I want to see Witcher 2 at 1080p with Ubermode enabled. On my current 580 SLI cards, I get around 28-45fps. Based on what I can see in the Tom's Hardware benches, and knowing what my FPS are in the games they used, it seems like 2x 680s will finally deliver solid 60fps in Witcher 2 in Ubermode... at least I hope.
 

Hawk269

Member
In all honesty, as someone who has been eagerly waiting for the next Nvidia cards, I don't think "low/mid/high"-end/range means shit anymore if this is the pricing model they're using.

This is an enthusiast, top-of-the-line card retailing for ~$550. Their 680/790/4200 Ti/whatever Kepler flagship can be twice as fast as the 7970, but what's the point for 99.99% of the "enthusiast" PC market if it ends up with a price tag close to a grand?

Fuck, I miss the days of the GTX 460.

Well Corky, if a $999.99 single-GPU card can outperform two standard cards in SLI, then I will dive in. I am also referring to a single card being able to outperform 2x 580s, since they are around $500 each for the 3 GB versions. For me, and I think many others, a single-GPU solution has always been the wish, since ideally it is the best way to go; SLI has been very easy for me, with minor issues, but it is the only way I could achieve the FPS I want at the IQ level I want. If someone releases a super single-GPU card with the horsepower of 2x 580s and a bit more performance, I would jump in at a grand... easily.

Hell, with the two 580 3 GB Classifieds, I paid close to $580.00 for each of them; with tax and shipping I think it came out close to $1200.00. So I already did it once and will do it again for a single-GPU solution.

Am I crazy...pretty much...:)
 
Bingo. But I want to see Witcher 2 at 1080p with Ubermode enabled. On my current 580 SLI cards, I get around 28-45fps. Based on what I can see in the Tom's Hardware benches, and knowing what my FPS are in the games they used, it seems like 2x 680s will finally deliver solid 60fps in Witcher 2 in Ubermode... at least I hope.

I am not sure I understand the obsession with uber mode. It doesn't really look that much better; it's just supersampling. And you realize it can be set to Uber x2, x3, x4, and so on and so forth...? So actually, there are still levels of uber beyond the default. It's a pointless goal to spend money on.
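For a sense of why chasing it gets expensive (a minimal sketch; the exact sample pattern per uber level is a guess, but shading cost scales roughly linearly with sample count either way):

```python
# Supersampling cost: shading work grows with the number of samples per frame.
base_pixels = 1920 * 1080
for level in (1, 2, 3, 4):  # hypothetical "uber" levels
    samples = base_pixels * level
    print(f"uber x{level}: {samples:,} samples per frame (~{level}x the shading work)")
```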
 

gatti-man

Member
I am not sure I understand the obsession with uber mode. It doesn't really look that much better; it's just supersampling. And you realize it can be set to Uber x2, x3, x4, and so on and so forth...? So actually, there are still levels of uber beyond the default. It's a pointless goal to spend money on.

Agreed.
 

elty

Member
Well, the benchmark choice is pretty questionable:
- no multi-monitor
- HAWX 2, WoW (games that are known to favor Nvidia)
- 1680x1050 (who will spend $500 on a video card with just a 20" monitor?)

But it looks like, at the very least, it is competitive with the 7970 while using less power. I hope they release the lower-end variants soon and push the price of the 77xx/78xx down.
 

Hazaro

relies on auto-aim
Pasting this over from [H] since people ask about naming:

Jorona said:
Yeah, Nvidia's typical naming convention is G(Core Designation)(Revision)(Performance Class).
And before GT200 it was just G(Revision)(Performance Class).

0 is the highest performance class, 8 the lowest.

So GF100 was Graphics Core Fermi, Revision 1.0, Top Performance Class (GTX 480).
GT215 was Graphics Core GT, Revision 2.1, Low-Mid Performance (GT 240).
G80 was Graphics Core, Revision 8, Top Performance Class (8800 GTS/GTX/Ultra).

GK104 is Graphics Core Kepler, Revision 1.0, High-Mid Performance Class. It should get the 660 designation, but it's being pushed into the high-end role. It's either because GK100 or GK110 have issues, or Nvidia truly believes they don't need a bigger gun for this fight.
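The scheme above is regular enough to decode mechanically. A toy sketch (the decode helper is hypothetical, not anything from Nvidia, and it only handles the post-GT200 G<arch><revision><class> form):

```python
import re

# Letter-to-architecture mapping as given in the quote above.
ARCHES = {"F": "Fermi", "K": "Kepler", "T": "Tesla-era GT"}

def decode(name):
    # Match e.g. "GK104": letter = architecture, two digits = revision, last digit = class.
    m = re.fullmatch(r"G([A-Z])(\d)(\d)(\d)", name)
    if not m:
        raise ValueError(f"not a G<arch><rev><rev><class> code name: {name}")
    arch, major, minor, perf = m.groups()
    return (ARCHES.get(arch, arch),
            f"revision {major}.{minor}",
            f"performance class {perf} (0 = highest, 8 = lowest)")

print(decode("GF100"))  # ('Fermi', 'revision 1.0', 'performance class 0 (0 = highest, 8 = lowest)')
print(decode("GK104"))  # ('Kepler', 'revision 1.0', 'performance class 4 (0 = highest, 8 = lowest)')
```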
 

artist

Banned
I don't get a sense that they're denying or downplaying the numbers, just that they don't want to get sued for breaching the NDA.
You are right. Kyle Bennett confirmed it:

Kyle_Bennett HardOCP Editor-in-Chief said:
While I cannot vouch for the validity of the data, and I never would for another website, I can tell you this. Chris Angelini, THG Editor-in-Chief, did verify that "someone accessed our internally-facing CMS and exported all of the charts that I had uploaded in preparation for my GTX 680 review." So those are THG charts and such.

Still, the selection of games is not the usual:

Games used in regular THG reviews:

Battlefield 3
Metro 2033
Aliens Vs. Predator
Crysis 2
Mafia 2
GTA IV
Batman: Arkham Asylum
DiRT 3
StarCraft II
The Elder Scrolls V: Skyrim
World Of Warcraft

Games used in GTX680 THG review:

Battlefield 3
Metro 2033
Crysis 2
DiRT 3
HAWX 2
World Of Warcraft

We'll get real reviews soon, just a few hours to go!
 

Hawk269

Member
I am not sure I understand the obsession with uber mode. It doesn't really look that much better; it's just supersampling. And you realize it can be set to Uber x2, x3, x4, and so on and so forth...? So actually, there are still levels of uber beyond the default. It's a pointless goal to spend money on.

Gaming on my HDTV, I see a significant improvement in IQ with Uber mode over running everything at max except uber. For me, it does look much better in Uber mode. And while I did not know there are levels of Uber above the default, the fact remains that I can't run uber at 60fps even with two 580s, which is what I would like, since I do think it looks better.

I am not sold on these new cards or the AMD offerings as of yet, so I think I can wait and see, especially since I prefer EVGA due to their service, and the partners always come up with cool new versions of the main card several months after launch.
 