
Nvidia Kepler - Geforce GTX680 Thread - Now with reviews

Deadstar

Member
You know, friends, I had been an Nvidia guy, then someone recommended ATI. So I bought a Radeon card and had nothing but problems with it, and while they could have just been my own issues, I will forever stick with Nvidia.
 

Corky

Nine out of ten orphans can't tell the difference.
Heavily overclocked vs. factory overclock is never a fair comparison.

Nor is comparing a 3gb card vs a 1.5gb one in eyefinity resolutions...

Something tells me a stock 7970 vs. a stock 580 with an equal amount of RAM, at the same standard/common resolution (e.g. 1080p), is another story.
 

Doc Holliday

SPOILER: Columbus finds America
I'd be genuinely shocked if their (according to the OP) "mid-tier" GTX 660/760 beats a 7970.


That would be absolutely insane! I just bought a 7970 too, lol. One question though: why would Nvidia keep quiet while AMD launched their flagship card? If they are that close to launch, all they had to do was release some benchmarks and it would have been enough to stop people from purchasing the 7970. Makes no sense to me :/
 

sk3tch

Member
Nor is comparing a 3gb card vs a 1.5gb one in eyefinity resolutions...

Something tells me a stock 7970 vs. a stock 580 with an equal amount of RAM, at the same standard/common resolution (e.g. 1080p), is another story.

There's an answer for that - there is no other GTX 580 that can do 3 or more monitors on a single card. Nvidia limited them all to two (unless you do SLI) and Galaxy's MDT is the first to have more than that. So there's no way to do a 3GB to 3GB comparison.

And you're kidding yourself if you think a 7970 and a GTX 580 3GB are close...esp. at 1080p...one resolution where 3GB is not needed.
 

Corky

Nine out of ten orphans can't tell the difference.
There's an answer for that - there is no other GTX 580 that can do 3 or more monitors on a single card. Nvidia limited them all to two (unless you do SLI) and Galaxy's MDT is the first to have more than that. So there's no way to do a 3GB to 3GB comparison.

Correct, which is why I think it's a stupid benchmark to use for comparing the two cards. It's an extremely situational scenario and/or niche market in the grand scheme of things.


And you're kidding yourself if you think a 7970 and a GTX 580 3GB are close.

People talking about SLI'd 3GB 580s being close to one 7970 are kidding themselves, and no, I don't think I'm out of line thinking a 7970 is "only" 25% faster than a 580, stock.
 

x3sphere

Member
Judging from the games HardOCP tested, anyway, they weren't VRAM-starved. BF3 can be in multiplayer, but the single-player portion uses far fewer resources. And it's pointless to bench the multiplayer portion, as gameplay can vary significantly. If the games were VRAM-starved, there would be massive frame rate drops on the minimum end, which I'm not seeing.
 

squidyj

Member
Why is there only a 30% chance that benchmarks like this will include a Source game? Couldn't give less of a crap about whatever engine an F1 game is on, but every benchmark test seems to go out of its way to include it or crap like it.

What would be the point?
 

sk3tch

Member
Correct, which is why I think it's a stupid benchmark to use for comparing the two cards. It's an extremely situational scenario and/or niche market in the grand scheme of things.

Except they didn't...they did both Eyefinity and no Eyefinity comparisons. So you really have no issue here. Link for reference: http://www.hardocp.com/article/2012/01/09/amd_radeon_hd_7970_overclocking_performance_review/1. It's around 30% faster in standard (non-Eyefinity).

People talking about SLI'd 3GB 580s being close to one 7970 are kidding themselves, and no, I don't think I'm out of line thinking a 7970 is "only" 25% faster than a 580, stock.

Never said anything about 580 SLI of any flavor comparing to a 7970 in benchmarks.
 

SRG01

Member
Anyway, I won't buy Nvidia because AMD's drivers are so much better, but I hope the competition brings prices down.

Wh... Oh, I see that people have beaten me to it.

At any rate, was this news surprising to anyone? AMD and nVidia have been leapfrogging each other for years.
 

Corky

Nine out of ten orphans can't tell the difference.
Except they didn't...they did both Eyefinity and no Eyefinity comparisons. So you really have no issue here. Link for reference: http://www.hardocp.com/article/2012/01/09/amd_radeon_hd_7970_overclocking_performance_review/1. It's around 30% faster in standard (non-Eyefinity).

OK, around 25% like I said, then. No problem here.


Never said anything about 580 SLI of any flavor comparing to a 7970 in benchmarks.

And what made you think I was talking about you? :S

"I hope you don't mean to say that one 7970 destroys two GTX 580s."

"In some cases, it comes close."

Which spurred my original "useless benchmark" comment.
 

theBishop

Banned
Wh... Oh, I see that people have beaten me to it.

At any rate, was this news surprising to anyone? AMD and nVidia have been leapfrogging each other for years.

In hardware, yes. But ATI's software has always been awful. They still bundle crap in the installer, which is unbelievable.
 

tokkun

Member
I'd be genuinely shocked if their (according to the OP) "mid-tier" GTX 660/760 beats a 7970.

I agree, but the OP may have been indicating that if the GK104 handily beats AMD's mid-tier, then the high-end part (GK100 iirc) will likely do the same with the 7970.

The article itself is vague about which AMD part GK104 is beating. My assumption would be that they are comparing it to unpublished numbers for the 7950 or 7870, but that's obviously just speculation.
 

jwhit28

Member
Wh... Oh, I see that people have beaten me to it.

At any rate, was this news surprising to anyone? AMD and nVidia have been leapfrogging each other for years.

You are right, it has actually been quite stable.

AMD makes it to the market first, Nvidia arrives fashionably late

AMD has the best dual card, Nvidia has the best single card

AMD usually has a few driver support problems, Nvidia costs a little more and uses a little more power

AMD has Eyefinity, Nvidia has much better 3D gaming support


The only wild cards have been anti-aliasing support (which to me seems even right now, especially since most use MSAA+FXAA) and SLI/Xfire performance.

I still don't see how people have such strong bonds to either side. Whatever card has the best performance at $250 is mine in June.
 

grendelrt

Member
I agree, but the OP may have been indicating that if the GK104 handily beats AMD's mid-tier, then the high-end part (GK100 iirc) will likely do the same with the 7970.

The article itself is vague about which AMD part GK104 is beating. My assumption would be that they are comparing it to unpublished numbers for the 7950 or 7870, but that's obviously just speculation.

Yeah, that's what I was assuming as well. We will have to wait until March-ish to find out.
 
So, what do you guys think are the chances Sony will use a Kepler-based GPU in the next PlayStation?

'Cause if this "report" by CD is true, a console with this GPU architecture would be awesome, even if it was a mainstream part.
 
Why do people buy top-end AMD cards over Nvidia ones?

I'm genuinely interested, because from what I've seen Nvidia has better drivers, resulting in better performance, better antialiasing support and compatibility with numerous profiles, and they seem to outperform the AMD cards whilst costing around the same.



They used to offer better price/performance. That is, until they decided to go fucking crazy with the price on the 7970.
 

LiquidMetal14

hide your water-based mammals
I don't understand what's so hard to believe about AMD cards and drivers being pretty good. There are issues on Nvidia too, so I don't know why the mudslinging is aimed mainly at AMD over their drivers. I've never had issues, and most of the ones I've read about have been Crossfire-related. I've never had issues with Nvidia drivers either. It's like spreading FUD. I swear, it's as if every time we get an AMD thread we should shit-talk Nvidia just because. Drop this stupid, baseless argument. It's much better now. It's irritating enough for me to word it in such a way.

Nvidia and AMD are not perfect. One isn't head and shoulders better than the other in every department. Performance is always swinging and happens to be in AMD's favor right now. Any enthusiast should be happy. I'm seeing more hardware allegiance now than ever. It's so weird, even on YouTube or anywhere else, including here.
 
Call me when nVidia allows multi-monitor setups to run on one GPU. I drive this with a single 6950:

[image: multi-monitor setup]

That...is...glorious.

However, the topmost monitor would surely cause major neck pain after a while.
 
Why do people buy top-end AMD cards over Nvidia ones?

I'm genuinely interested, because from what I've seen Nvidia has better drivers, resulting in better performance, better antialiasing support and compatibility with numerous profiles, and they seem to outperform the AMD cards whilst costing around the same.

Because AMD got there earlier, I guess.

I think this is fantastic news; it means AMD will have to pick up the slack with the 7870 cards and lower the prices of all their cards drastically.
Let the mid-range price wars commence!

I really resent them for their current monopoly price gouging, and I will not have forgotten it by the time they've lowered prices and Kepler is out.
Team green FTW this time (never thought I'd say that after the hilarious GeForce 4 MX naming scheme - consider that the GeForce 2 MX was the top-end card, while the GeForce 4 MX cards were complete trash, the equivalent of something you'd find integrated a few years later).
 

Shambles

Member
nVidia typically has the fastest single GPU, while AMD/ATI offered better bang/buck. Can't wait for Kepler to come out to drive some actual competition.
 
Cool, cool, Charlie isn't shit-talking Nvidia, AMD has good graphics drivers... next you'll tell me this new graphics card will only need power from the PCIe 16x bus and will be so quiet you'd think it's passively cooled.
 

Cipherr

Member
That...is...glorious.

However the topmost monitor would surely cause major neck pain after a while.

Sports on the top monitor - something you can hear and glance at from time to time. Work on two screens and browse a website on another. Jesus, I cannot wait. Although I don't think I'm going to go with 4 on my next build. Finding a quality monitor mount for 3 screens is easily done, but the 4th? I'd probably just buy a medium-sized TV for my office instead.
 

artist

Banned
They used to offer better price/performance. That is, until they decided to go fucking crazy with the price on the 7970.
AMD has priced their cards based on market conditions and product positioning - yes, the same thing that applied to the 4800 series, 5800 series, and 6900 series applies to the 7970 as well.
 
I'm not surprised at this at all. I said before it was a bad move for Mac to go ATI. I think ATI is fine for budget gaming but, I would never dream of using them for anything else.
 

Grymm

Banned
Here are a few things I don't get.

1. Supposedly AMD is doing the GPUs for the next Xbox and the Wii U, so they must be getting a ton of money for that. If they're swimming in all this money, then why can't the products and service/drivers be better?

2. AMD's support for new games in the past year-plus has been terrible. Absolutely friggin' awful. But if they're the ones doing the next Xbox and Wii U GPUs, wouldn't that mean there's a better chance that, in the next console gen, their PC cards will have better performance out of the gate, because ports to PC will mostly be made for their hardware instead of nVidia's?
 
I'm not surprised at this at all. I said before it was a bad move for Mac to go ATI. I think ATI is fine for budget gaming but, I would never dream of using them for anything else.

That move didn't affect Apple whatsoever, since their computers focus on, at best, mid-range cards that are further saddled by horrible OpenGL drivers in OS X.
 
AMD has priced their cards based on market conditions and product positioning - yes, the same thing that applied to the 4800 series, 5800 series, and 6900 series applies to the 7970 as well.



I paid 211 euros for my Asus 4870 in August of 2008. Two hundred and eleven euros.
 
Video card that's not out yet will outperform video card that's out now.

News at 11.

If only that were still how things went.

People here were cooing over the 7970 being 30 percent faster than the GTX 580, which is over a year old now, like it's some fucking miracle :p

No backsies now.

Exactly, fallout_NL, the 4870 was priced really nicely. I only paid 140 euros for mine a few months later.
AMD is being greedy with the 7970 (and doing a semi-paper launch by sending out so few of them, apparently).
That was the edge AMD had: they offered value.
I had to wait until my 4870 512MB card was basically becoming too much of a bottleneck before I could consider the 68xx cards good value, let alone the 7970.

My loyalty is only tied to how well I'm treated (assuming the product is decent and not broken).
The message of "we'll milk you when we get the opportunity" arrived loud and clear (not that Nvidia is any better, but hey, they have the better cards lately, so it's an easy choice unless Nvidia does something stupid).
 

Pimpbaa

Member
Why is there only a 30% chance that benchmarks like this will include a Source game? Couldn't give less of a crap about whatever engine an F1 game is on, but every benchmark test seems to go out of its way to include it or crap like it.

Maybe in the future people making these benchmarks will contact you to find out what games you deem acceptable to benchmark.
 

Hellish

Member
If the GTX 660/760 beats the 7970 I will actually laugh hard. I find it hard to believe, but if it does, I guess that means better prices for everyone, so I hope it does.




Call me when nVidia allows multi-monitor setups to run on one GPU. I drive this with a single 6950:

[image: multi-monitor setup]


Congrats, man. Aside from multitasking on multiple monitors, you can't really do much with one 6950.
 

Doc Holliday

SPOILER: Columbus finds America
I have a 7970, but I'm very curious to see how Nvidia pulls this off if it's true. Less RAM, less bandwidth, and it still manages to be faster?! Hopefully they don't turn out like the first Fermi; those things were loud, large, and hot as fuck.
 
So, what do you guys think are the chances Sony will use a Kepler-based GPU in the next PlayStation?

'Cause if this "report" by CD is true, a console with this GPU architecture would be awesome, even if it was a mainstream part.

If (and that's a big and unlikely "if") Sony goes with Nvidia, then I'd say that's a safe bet.

Nvidia's next-gen Kepler-based mobile/notebook GPUs are set to hit the market in Q3 this year. By mid-to-late 2013 they should have the ability to produce a nice custom Kepler chip for a console.
 
You are right, it has actually been quite stable.

AMD makes it to the market first, Nvidia arrives fashionably late

AMD has the best dual card, Nvidia has the best single card

AMD usually has a few driver support problems, Nvidia costs a little more and uses a little more power

AMD has Eyefinity, Nvidia has much better 3D gaming support


The only wild cards have been anti-aliasing support (which to me seems even right now, especially since most use MSAA+FXAA) and SLI/Xfire performance.

I still don't see how people have such strong bonds to either side. Whatever card has the best performance at $250 is mine in June.



Well, for the Nvidia side it's extras like PhysX and CUDA and things like that.
 

Hari Seldon

Member
I think the driver issues people talk about with AMD are with Crossfire; I have never had AMD driver problems when using a single GPU. I don't really care much, though - I will switch in a heartbeat - but AMD cards seem to be engineered very well, with their relatively low power consumption.
 