
Nvidia Kepler - Geforce GTX680 Thread - Now with reviews

sk3tch

Member
if you want stable 60 fps FOR ANY game, then yes, it's worth it. Most games run at 60 fps for me, BF3 runs at 40-60 fps with dips into 30s which is fine by me. These are not fast-paced games. Racers, twitch shooters, slashers, fighters, brawlers, platformers should run @ 60 fps, I agree. I can't remember a single taxing game in these genres which was released recently.

Different strokes for different folks. I'm not good, but I like to game with "competitive"-grade equipment - so 120 FPS / 120 Hz / Vsync / 1080p would be ideal. But I've mostly given up on that with the games I want to play (especially since SLI/CFX add micro-stuttering and other issues and are pretty much required for that level). Online FPS MP at lower than 60 FPS doesn't work for me. Too much jank.
 
Online FPS MP at lower than 60 FPS doesn't work for me. Too much jank.


The more people I read with opinions similar to yours, the more convinced I am that the world needs to be destroyed in nuclear fire. In my youth I was playing sub-30fps Counter Strike with 200-300ms ping via dial-up most nights of the week. Even though I (obviously) do prefer it when possible, in my opinion 60fps is the most overrated thing in recent memory.
 

sk3tch

Member
The more people I read with opinions similar to yours, the more convinced I am that the world needs to be destroyed in nuclear fire. In my youth I was playing sub-30fps Counter Strike with 200-300ms ping via dial-up most nights of the week. Even though I do prefer it when possible, in my opinion 60fps is the most overrated thing in recent memory.

Yes, a lot of things were different back then. Glide, Voodoo 3 3000s, Tribes, and Celeron 300As...good times.
 

subversus

I've done nothing with my life except eat and fap
The more people I read with opinions similar to yours, the more convinced I am that the world needs to be destroyed in nuclear fire. In my youth I was playing sub-30fps Counter Strike with 200-300ms ping via dial-up most nights of the week. Even though I (obviously) do prefer it when possible, in my opinion 60fps is the most overrated thing in recent memory.

Seriously, 30 fps was like 60 fps for me when I was a kid, and I spent most of my time on PC (I haven't played much MP though, except Q1-3, and did that at LAN clubs where the PCs were built for that kind of gaming, so the framerate was pretty smooth there). I don't even remember people talking about games being smooth. Most people I knew talked about getting an acceptable framerate while maxing out graphics. 60 fps framerates have only become possible in recent years thanks to consoles.

Well then, that's not maxed out. :p Advanced DOF cuts performance in half. It does look quite stunning, though.

it makes me sick (not literally) so I turned it off.
 

Celcius

°Temp. member
Do you think the gtx 680 will really be $650? I wonder what the lightning XE will cost lol. I hope there's no crazy price gouging the first month it comes out.
 

Hazaro

relies on auto-aim
Do you think the gtx 680 will really be $650? I hate to think what the lightning XE will cost lol. I hope there's no crazy price gouging the first month it comes out.
If it's the fastest card by 20%+ and OC's as well as a 7970 they can charge whatever they want. :/
 

Zzoram

Member
Do you think the gtx 680 will really be $650? I wonder what the lightning XE will cost lol. I hope there's no crazy price gouging the first month it comes out.

When has a hot new GPU launch not had price gouging while supplies are limited?
 
So can anyone without an Nvidia or AMD bias tell me what graphics card to get? I'm looking for a top-end graphics card that will last a few years and also won't blow my budget. Do I wait for the next line of AMD/Nvidia cards, or go with something that's out now?
 

teh_pwn

"Saturated fat causes heart disease as much as Brawndo is what plants crave."
So can anyone without an Nvidia or AMD bias tell me what graphics card to get? I'm looking for a top-end graphics card that will last a few years and also won't blow my budget. Do I wait for the next line of AMD/Nvidia cards, or go with something that's out now?

GTX 580 or AMD 7970. AMD is faster right now by quite a bit but Nvidia is about to refresh.

For a cheaper solution:
1. AMD 7950, OC it and consider getting a second in the future to crossfire
2. Wait for Nvidia's release in 1-3 months.
 
GTX 580 or AMD 7970. AMD is faster right now by quite a bit but Nvidia is about to refresh.

For a cheaper solution:
1. AMD 7950, OC it and consider getting a second in the future to crossfire
2. Wait for Nvidia's release in 1-3 months.
Yeah, I realise AMD is killing it right now, but do you think it's worth it to wait for Nvidia's refresh? I'm not a fan of multi-GPU; I'd rather stick to one, though I'm not sure why.
 

TheExodu5

Banned
Yeah, I realise AMD is killing it right now, but do you think it's worth it to wait for Nvidia's refresh? I'm not a fan of multi-GPU; I'd rather stick to one, though I'm not sure why.

If you don't mind waiting, then wait. Rumors are suggesting the new NVidia cards might be very fast. No way to know until they're out, though.
 

teh_pwn

"Saturated fat causes heart disease as much as Brawndo is what plants crave."
Yeah, I realise AMD is killing it right now, but do you think it's worth it to wait for Nvidia's refresh? I'm not a fan of multi-GPU; I'd rather stick to one, though I'm not sure why.

If you can wait, then wait. Nvidia seems to have a bit better drivers and Physx. They tend to have the best performance. AMD tends to be lower power, heat, and cost, making multi-GPU easier (with radeon pro or catalyst 12.1 profile tweaks). If Nvidia has the best performance after they refresh, that might be your best choice. Currently, no contest 7970 is the best performer.

With rumors of meager 6670s in next-gen consoles, and despite my ignorance of the PC lineup this year, I don't see either a GTX 580 or a 7970 being slow within the next 2 years. Sure, if you do MSAAx8 and crazy effects in a poorly designed game, it will run slow.

Don't wait on the CPU side of things though. Ivy is going to be focused on mobile stuff with better integrated GPU and lower power. It probably will perform the same as Sandy at stock.
 

iNvid02

Member
If you're gonna go with one card, just get the top-end one and you're set for a few years.

I got a 570 but was not happy with it (mostly because of my resolution) and ended up getting another before trading both in for 580s.
 

Zzoram

Member
Whether you intend to get a Radeon 7900 series or GTX 600 series, wait for the GTX 600 series to come out. It will likely cause 7900 series cards to drop a bit in price.

AMD cards have been winning the performance for price ratio for several years now, but nVidia's top card typically wins on performance (although it comes out months after AMD launches a new top card).
 
I suppose I'll wait, only a few months from now. Thanks people.

Whether you intend to get a Radeon 7900 series or GTX 600 series, wait for the GTX 600 series to come out. It will likely cause 7900 series cards to drop a bit in price.

AMD cards have been winning the performance for price ratio for several years now, but nVidia's top card typically wins on performance (although it comes out months after AMD launches a new top card).
It's going to be a brand-new PC, so I don't mind the price so much. Although waiting for its price to drop when Nvidia releases and going multi-GPU wouldn't be such a bad idea either.
 

Zzoram

Member
Honestly any top line GPU you buy will last you years.

I'm still running my Radeon HD4870 from 2008 happily. Sure I can't max out the newest games, but I can still play most games on High, other than the latest first person games, which I can still play on Medium.

It helps that I use 1680x1050 resolution on a 22" monitor. People who went for 1920x1200 have had to upgrade more often to maintain performance.
 
Honestly any top line GPU you buy will last you years.

I'm still running my Radeon HD4870 from 2008 happily. Sure I can't max out the newest games, but I can still play most games on High, other than the latest first person games, which I can still play on Medium.

It helps that I use 1680x1050 resolution on a 22" monitor. People who went for 1920x1200 have had to upgrade more often to maintain performance.
I think the most important thing for me would be the drivers. I don't mind tweaking and stuff (in fact I love it) but I want to be able to buy a new game and run it without issue on day 1. So I think Nvidia will be my choice.
 
Kepler finally revealed? Seems reasonable.

http://en.expreview.com/2012/02/06/entire-nvidia-kepler-line-up-unearthed/20836.html

NVIDIA-600-2.jpg


Very weird/skimpy RAM sizes though.

Edit: on second thought these specs look pretty odd to me, guess we'll see April 12.
 
Hmmm, so the 690 is dual 670s? But it costs just as much?

Also them prices :(, 250 for 560 ti - 400 for 660 ti. ( if true )

Pricing looks good to me, they're alleging GTX 580 performance for $319 with GTX 560? Sounds good to me.

Right now I'm kind of disinclined to believe that chart though. We'll see.
 

JaseC

gave away the keys to the kingdom.
I hope the Aussie prices aren't too inflated. A 670 for ~$550 would be nice, although I question the longevity of 1.75GB VRAM when aligned with my long-gestating desire to purchase the Dell U2711 or its future equivalent.
 

Maxrunner

Member
why do people buy top end amd cards over nvidia ones?

im genuinely interested, because from what i've seen nvidia has better drivers resulting
in better performance, better antialiasing support and compatibility with numerous profiles,
and they seem to outperform the amd cards whilst costing around the same.

i loled....
 

artist

Banned
Charlie's take on GK104's die size:

How big is the Kepler/GK104 die?
Bigger than we thought, smaller than Tahiti

There are a bunch of rumours floating about GK104/Kepler, but nothing concrete on the physical chip. Luckily, a few days ago, some kind soul showed SemiAccurate a die.

Short story, our favourite ruffled valley mole (Indianfoodus betterthaninminneapolus) didn’t have any callipers handy, so we had to eyeball it with a ruler. The chip we saw is roughly 18-19mm * 18-19mm, putting the range at 324-361mm^2. If you assume it is both square and in the middle, 342mm^2 wouldn’t be a bad number to plug into the spreadsheets.

While this is a lot bigger than we were expecting, it is still smaller than Tahiti by an appreciable margin. If performance ends up about where we have been told, it will be about as efficient as AMD’s current chips at a lot lower power draw. If the price holds up too, GK104 will own the mid-market, aka where the money is.

Net result, if Nvidia can get it to yield, power use stays where it is now, as do clocks, and they don’t get greedy with the price, well, three of four isn’t a bad tally. Seriously though, assuming no major changes, it looks like a decent part, but let’s wait to see what comes out, and more importantly, when. Late March is the current best case, assuming there is no A3. More when we get it.
He's gone from awesome performance to peaky performance and really small to slightly small. It seems like Nvidia finally caught up with Charlie's source and could possibly be making a fool out of him.
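For what it's worth, the eyeballed die math in the quoted article does check out. A quick sketch just re-running Charlie's own numbers (nothing here beyond what the article states):

```python
# SemiAccurate's eyeballed GK104 die size: roughly 18-19 mm per side,
# measured with a ruler rather than callipers.
lo_side, hi_side = 18.0, 19.0            # mm per side

area_lo = lo_side ** 2                   # low end of the range
area_hi = hi_side ** 2                   # high end of the range
area_mid = ((lo_side + hi_side) / 2) ** 2  # "square and in the middle"

print(f"{area_lo:.0f}-{area_hi:.0f} mm^2, midpoint ~{area_mid:.0f} mm^2")
# -> 324-361 mm^2, midpoint ~342 mm^2, matching the article's figures
```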
 

Mr Swine

Banned
So the 680 is only 45% faster than the 580? Doesn't seem that much, going to wait for the generation after so it will be 100% faster than my 580
 

artist

Banned
So the 680 is only 45% faster than the 580? Doesn't seem that much, going to wait for the generation after so it will be 100% faster than my 580
Umm ..

1. That table is not accurate.
2. It says 45% faster than 7970 which puts it roughly at 90% faster than 580.
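The jump from "45% faster than a 7970" to "roughly 90% faster than a 580" only works if the speed-ups chain multiplicatively. A minimal sketch, assuming the 7970 is about 30% faster than the GTX 580 (that baseline is my rough figure, not from the chart):

```python
# "X% faster" figures compound by multiplication, not addition.
gain_7970_over_580 = 1.30   # assumed: HD 7970 ~30% faster than GTX 580
gain_680_over_7970 = 1.45   # from the (dubious) leaked chart

gain_680_over_580 = gain_7970_over_580 * gain_680_over_7970
print(f"~{(gain_680_over_580 - 1) * 100:.0f}% faster than a GTX 580")
```

With those inputs it lands just shy of 90%, which is where the "roughly 90%" reading comes from.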
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
So the 680 is only 45% faster than the 580? Doesn't seem that much, going to wait for the generation after so it will be 100% faster than my 580

The chart is clearly false.....but even if it was right, the 680 according to it IS 90% faster than a 580.
Sooooooooo
 
Different strokes for different folks. I'm not good, but I like to game with "competitive"-grade equipment - so 120 FPS / 120 Hz / Vsync / 1080p would be ideal. But I've mostly given up on that with the games I want to play (especially since SLI/CFX add micro-stuttering and other issues and are pretty much required for that level). Online FPS MP at lower than 60 FPS doesn't work for me. Too much jank.

Now even 60hz isn't good enough? Sheesh. I used to play DOOM 2 over a 14400 baud modem with atrocious ping times (nobody even talked about such things back then) on a machine that I had to reduce the size of the screen in order for it to play anything that resembled smooth. And I owned. :p

Get off my lawn!
 

artist

Banned
http://semiaccurate.com/2012/02/07/gk110-tapes-out-at-last/

GK110 tapes out at last
Big Kepler is big, long cat is longer

With the size of GK104 now pretty settled, what about the big one? Sources are now saying this chip might be called GK110, but we are still hearing some insiders say GK112. Let’s stick with GK110 for the article though; numbers ending in 0 tend to soothe moles more than numbers ending in 2.

Recently, a Taiwanese dancing mole, hair dyed green and still a bit hung over from New Year parties, gave us the answer. He said that GK110 is basically reticle limited, about 23.5mm on a side. The math says that it is about 550mm^2, with the last two generations coming in at 529/550mm^2 (GF100/GF110 respectively) and 576mm^2 for GT200.

The reason for the vagueness is that the chip just taped out a few weeks ago. Current green-topped roadmaps, not from green-topped moles, have the release date slated for August/September. While this is quite possible, several of SemiAccurate’s sources with prior experience bringing high performance silicon to market scoffed at the time tables. Either way, late Q3 2012 is a decent mental placeholder.

One thing the mole said when he looked up with somewhat bleary eyes is that the chip definitely has a 384-bit memory bus. Other documents have the part burning close to 300W, and you can read two things into this number. First is that this part is very likely to be an HPC oriented chip with the CU count, and attendant power-hungry interconnect, being notably higher than GK104. Second is that the rumours of GK100 being cancelled early in the game are likely true; they said the problems revolved around interconnects and power issues.

Will GK110 solve these, and therefore Denver’s similar problems? Will it pull ‘only’ 300W, or will we get another Fermi-esque performance per watt waterfall? Will GK110 be skewed toward HPC like the early leaks suggested? Will it once again come at the cost of gaming performance? Will there even be a consumer variant, or will it be professional/compute only? Some of these questions can be answered when silicon comes back. Others are a little more subjective. No matter what happens, this will be fun to watch.
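The ~550mm^2 figure in the article follows directly from the quoted side length; a one-liner to check it against the previous big dies the article lists:

```python
# GK110 is said to be reticle limited at "about 23.5mm on a side".
side_mm = 23.5
area_mm2 = side_mm ** 2
print(f"~{area_mm2:.0f} mm^2")   # 552.25, i.e. the article's "about 550mm^2"

# Prior big chips quoted in the same article, for comparison:
prior_dies = {"GF100": 529, "GF110": 550, "GT200": 576}
for name, area in prior_dies.items():
    print(f"{name}: {area} mm^2")
```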
 

artist

Banned
Why.... why on earth is his writing like that? Is it funny, or am I just missing something?
It's funny
in his head.

On topic, I think I'm a bit bummed with the 384b memory bus more than anything. Don't mind 300W for the top dog as long as it's got the perf/W that Nvidia promised.
 

Corky

Nine out of ten orphans can't tell the difference.
It's funny
in his head.

On topic, I think I'm a bit bummed with the 384b memory bus more than anything. Don't mind 300W for the top dog as long as it's got the perf/W that Nvidia promised.

So you think they ditched the 512b one? Is that what he implies in his "it got canceled" statement in the article?
 

artist

Banned
So you think they ditched the 512b one? Is that what he implies in his "it got canceled" statement in the article?
Don't know if GK100 was intended as a 512b design .. I expected that the top dog (GK100/GK110/GK112) in Kepler would be 512b.
 

artist

Banned
August? I guess I might as well buy a 7950 soon then.
I'm guessing that's Charlie time, which includes a respin. From my calculations, if it taped out a few weeks ago and barring major issues with no respin required, the worst-case scenario would be a June launch.
 

pestul

Member
I'm guessing that's Charlie time, which includes a respin. From my calculations, if it taped out a few weeks ago and barring major issues with no respin required, the worst-case scenario would be a June launch.

A June delay for the GK104 would have AMD rubbing its hands together and laughing maniacally.
 

artist

Banned
A June delay for the GK104 would have AMD rubbing its hands together and laughing maniacally.

I was talking about GK110/GK112. GK104 taped out a long time back and, going by official statements, it should definitely be out by mid-April.
 

artist

Banned
Some rumored specs:

GK104
1536 Cuda Cores
128 TMUs
32 ROPs
950MHz Core Clock
160GB/s memory bandwidth

Expected performance between the 7950 and 7970.
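A quick sketch of what those rumored numbers would imply for raw throughput. The FLOPS figure follows the standard convention of one FMA (2 FLOPs) per CUDA core per clock; the bus width and memory data rate are my assumptions purely to show one configuration that matches the quoted 160GB/s, since the rumor gives only the total:

```python
# Theoretical single-precision throughput from the rumored GK104 specs.
cores = 1536                 # rumored CUDA core count
core_clock_hz = 950e6        # rumored 950MHz core clock
flops = cores * 2 * core_clock_hz   # 1 FMA = 2 FLOPs per core per clock
print(f"~{flops / 1e12:.2f} TFLOPS single precision")

# One way to arrive at 160GB/s (assumed, not in the rumor):
# a 256-bit bus with GDDR5 at 5 Gbps effective per pin.
bus_bits = 256
effective_rate = 5e9         # transfers per second per pin
bandwidth_gbs = bus_bits / 8 * effective_rate / 1e9
print(f"{bandwidth_gbs:.0f} GB/s")
```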
 

artist

Banned
Time for an update ..
GK104 pops up in the wild
Kepler is in the hands of the AIBs

GK104/Kepler cards are now floating outside of NV’s orbiting headquarters, and have been landing all over recently. The short story is that the companies that need to have them do, or will really soon.

Reports coming in from the far east say that those high up in the priority list started getting Kepler cards in various guises early this week, possibly late last. The number of sightings from sources that SemiAccurate trusts has been going up almost exponentially over the past few days, and will probably keep doing so for a bit.

The short story is that some places have been getting early variants, others later, and in various states of functionality. Since they are meant for early hardware design and testing, they are more than adequate for the task. The lack of polish seems to indicate that Nvidia is pretty hell-bent on getting cards out the door ASAP.

If things go as normal, it takes 4-6 weeks from AIB sampling to cards on the shelves. This would mean late March or early April, just like we have been saying for weeks. The ball is rolling, and the variables are being narrowed quite quickly.
 