
Nvidia GTX 6xx/7xx Launch Thread of Swirling Kepler Rumors

Does any game use more than 2GB of data yet?

Would be funny if Nvidia released their card and it matched or exceeded the 7970 while being cheaper.
It's like I'm in 2009 again :p
(or in 2003 before that)

People who bought high-end 512 MB VRAM graphics cards in 2008-2009 got screwed over in the long run, especially those who bought two GPUs, but single-GPU users like me did too.


You can safely bet on games using more and more, possibly with a big boost in a year or so when the next consoles arrive.

Unless you plan to upgrade every single year, you'll regret it.
Just like me going 'oh the 512MB hd4870 will be enough and I'll save 20 euros' a few years back.
Plus, once the VRAM does become a problem, it takes a big dump on the resale value of your card.

I bought a tide-me-over 1GB card (6870) when BF3 released, expecting way more at a better price from AMD this year, and it's already bottlenecking me in Sonic Generations and BF3 even at 1600x1200. But it was cheap and I'll replace it within a year, so...
 

eastmen

Banned
What's so fucking stellar about the 7970 at €480 ($610)? I don't get it, I paid €280 for my 4870 back when it released. Does AMD see a world where people are still making money hand over fist?

The 580s aren't dropping in price over here either. I need to build a new computer, but I really don't know what GPU I should go for.



It's the fastest single-GPU card: it offers better performance and uses less power than the GTX 580, and it launched at the same price point. The question you should be asking is: what was so stellar about the GTX 580?


SneakyStephan said:
Would be funny if Nvidia released their card and it matched or exceeded the 7970 while being cheaper.

It's like I'm in 2009 again :p
(or in 2003 before that)

People who bought high-end 512 MB VRAM graphics cards in 2008-2009 got screwed over in the long run, especially those who bought two GPUs, but single-GPU users like me did too.


You can safely bet on games using more and more, possibly with a big boost in a year or so when the next consoles arrive.

Unless you plan to upgrade every single year, you'll regret it.
Just like me going 'oh the 512MB hd4870 will be enough and I'll save 20 euros' a few years back.

I bought a tide-me-over 1GB card (6870) when BF3 released, expecting way more at a better price from AMD this year, and it's already bottlenecking me in Sonic Generations and BF3 even at 1600x1200. But it was cheap and I'll replace it within a year, so...


People forget that AMD does Eyefinity with a single card.

5760x1080 is 6,220,800 pixels vs 2,073,600 at 1920x1080, so you're going to run out of VRAM faster than at 1080p.
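For what it's worth, the pixel counts above check out; here's a back-of-envelope sketch of the comparison (my own arithmetic, not from the thread):

```python
# Rough pixel-count comparison between triple-screen Eyefinity
# and a single 1080p display, using the resolutions quoted above.
eyefinity = 5760 * 1080  # three 1920x1080 panels side by side
single = 1920 * 1080     # one 1080p panel

print(eyefinity)                  # 6220800
print(single)                     # 2073600
print(round(eyefinity / single))  # 3x the pixels to fill
```

Triple the pixels means framebuffers and render targets balloon accordingly, which is why VRAM runs out sooner at Eyefinity resolutions.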
 

godhandiscen

There are millions of whiny 5-year olds on Earth, and I AM THEIR KING.
REALLY??!!

I just built my rig: one 3GB EVGA 580 and a 980X Extreme Edition, all stock, and every game I've played thus far at 1080p (BF: BC2, NFS: HP) averaged over 80 fps, and that's with everything maxed and 4x MSAA plus 4x SSAA applied in the Nvidia control panel. I have yet to play Batman and BF3, but you having TWO GTX 580s and saying the performance is horrible makes NO sense. This contradicts everything I have read about a setup like yours.

What CPU are you running? What resolution?

Keep reading the thread. I stated my CPU and I said 1080p. Also, BF: BC2 and NFS: HP aren't graphically demanding at all. I could max out BF: BC2 when it came out with a GTX 295. Moreover, other people with 1.5GB GTX 580 SLI have corroborated that it won't max out BF3 at 1080p. BF3 is very VRAM hungry. Maybe your 3GB card will max it, though.

It doesn't. That's why I'm wondering if he's playing above 1080p. Only map on BF3 that gives me "trouble" is Oman. And even then my FPS is in the 70ish range (everything Ultra).

You have a 3GB VRAM GPU. BF3 on Ultra requires over 2GBs of VRAM. You have no experience with my setup, each card has 1.5GB. As I said, for what they cost me, my GTX580 SLI setup is disappointing and I feel Nvidia saw it too and released the 3GB version of the GTX580, but that was about 7 months after I purchased mine. I am not happy with my SLI setup and I will sell the cards when Nvidia puts out a worthy successor, nothing will convince me otherwise.
 

Smokey

Member
You have a 3GB VRAM GPU. BF3 on Ultra requires over 2GBs of VRAM. You have no experience with my setup, each card has 1.5GB. As I said, for what they cost me, my GTX580 SLI setup is disappointing and I feel Nvidia saw it too and released the 3GB version of the GTX580, but that was about 7 months after I purchased mine. I am not happy with my SLI setup and I will sell the cards when Nvidia puts out a worthy successor, nothing will convince me otherwise.


Makes sense. I can see where you're coming from if you have the 1.5GB versions.
 

dr_rus

Member
The 580s aren't dropping in price over here either. I need to build a new computer, but I really don't know what GPU I should go for.
You should wait a couple of months. The new generation of video cards hasn't really started yet. The 7970 is just AMD trying to milk some money from early adopters.
 

eastmen

Banned
You should wait a couple of months. The new generation of video cards hasn't really started yet. The 7970 is just AMD trying to milk some money from early adopters.

By offering a better product for the same money as a competitor?

The 7970 is:

The same price as the 3GB GTX 580
Faster than the 3GB GTX 580
Uses less power than the GTX 580

I don't think you can call this milking. Obviously if someone can wait they should, but the 7970 is not a bad card, especially if you don't mind overclocking; at some points it can be 80% faster than a 580 according to HardOCP.
 
Keep reading the thread. I stated my CPU and I said 1080p. Also, BF: BC2 and NFS: HP aren't graphically demanding at all. I could max out BF: BC2 when it came out with a GTX 295. Moreover, other people with 1.5GB GTX 580 SLI have corroborated that it won't max out BF3 at 1080p. BF3 is very VRAM hungry. Maybe your 3GB card will max it, though.



You have a 3GB VRAM GPU. BF3 on Ultra requires over 2GBs of VRAM. You have no experience with my setup, each card has 1.5GB. As I said, for what they cost me, my GTX580 SLI setup is disappointing and I feel Nvidia saw it too and released the 3GB version of the GTX580, but that was about 7 months after I purchased mine. I am not happy with my SLI setup and I will sell the cards when Nvidia puts out a worthy successor, nothing will convince me otherwise.

Not true at all, please don't spread misinformation. 1.5 GB of VRAM is enough for Ultra BF3 @ 1080p. As stated before, BF3 uses dynamic VRAM caching and allocation. I ran Ultra BF3 @ 1080p with 1280MB of VRAM without hitting my VRAM limit.

BF3 on Ultra does not use over 2 GB VRAM, lol. Even on my 2.5 GB GTX 570 it won't go over about 1.5-1.6 GB VRAM usage at 1080p. Maybe if I was at 1440p or 1600p.
 

tokkun

Member
By offering a better product for the same money as a competitor?

The 7970 is:

The same price as the 3GB GTX 580
Faster than the 3GB GTX 580
Uses less power than the GTX 580

I don't think you can call this milking. Obviously if someone can wait they should, but the 7970 is not a bad card, especially if you don't mind overclocking; at some points it can be 80% faster than a 580 according to HardOCP.

The 580 is a year old and the 3GB part is treated as more of a luxury version. In terms of placement in the lineup, the 7970 is more comparable to the 1.5GB 580. Yes, the 7970 is a better value than the 580, but only because prices on the 580 have been almost completely frozen since its introduction.

In pretty much any other generation, it would be astonishing that there would even be a question of whether a new flagship single-GPU card provided better performance per dollar at its launch than the previous generation did at its launch. Yet the 7970, overclocking aside, is really no better in terms of price-to-performance than the 6970 was when it was released. Now, I'll grant that the 7970 and 6970 were released in two different price classes. Maybe the 7870 comes out at $350 and delivers 30% more performance than the 6970 and order is restored to the universe.
 

Sethos

Banned
You know what I hope they add? Ability to hook up 3 monitors to a single card. The 2 monitor limit is killing me.
 

Icelight

Member
You know what I hope they add? Ability to hook up 3 monitors to a single card. The 2 monitor limit is killing me.
I wish they'd let the GPUs downclock significantly when you have multiple monitors attached to a single card more than I want triple monitor support :\

It annoys me to the point where I don't use 2 monitors during the summer because my GPU runs flat out...and using Afterburner you can only downclock so far for 2D mode (so my GTX 570 still runs at 50C or so, compared to the 37 - 40C when in a 'full' downclocked state)

I mean, I understand if they can't clock the GPU all the way down to the 50/100 Core/Mem state when using 1 monitor, but they should have some branch in their drivers to clock it down to a level such that it's as low as possible while still retaining enough bandwidth to feed two monitors.

Although I guess that's more of a driver thing than a hardware/architecture thing...
 

Hellish

Member
other people with SLI GTX580 1.5GB have corroborated that it won't max out BF3 at 1080p. BF3 is very VRAM hungry. Maybe your 3GB card will max it though.

You have a 3GB VRAM GPU. BF3 on Ultra requires over 2GBs of VRAM. You have no experience with my setup, each card has 1.5GB. As I said, for what they cost me, my GTX580 SLI setup is disappointing and I feel Nvidia saw it too and released the 3GB version of the GTX580, but that was about 7 months after I purchased mine. I am not happy with my SLI setup and I will sell the cards when Nvidia puts out a worthy successor, nothing will convince me otherwise.


I can max out BF3 on Ultra with SLI GTX 480s, and those are also only 1.5GB. It seems you have some other issue.
 

Dennis

Banned
You know what I hope they add? Ability to hook up 3 monitors to a single card. The 2 monitor limit is killing me.

I am using a two monitor setup, my main screenshot-taking 2560x1600 monitor and my secondary, "baby" 1920 x 1200 monitor. I am thinking about getting two more 1080p monitors to max out my available ports.

Another reason to go SLI!
 

Datschge

Member
I really really hope that the next couple of generations of ultrabooks can get better gaming performance.

Ultrabooks are Intel's latest PR stunt to sell their average parts at a premium price. Until they significantly improve their graphics logic and drivers, their gaming performance won't compare to what Nvidia and AMD offer.
 

artist

Banned
People forget that AMD does Eyefinity with a single card.

5760x1080 is 6,220,800 pixels vs 2,073,600 at 1920x1080, so you're going to run out of VRAM faster than at 1080p.
 

iNvid02

Member
I will never, ever own a multi-GPU setup ever again, it ruined my passion for gaming for over a year.

what cards did you have?

i have SLI 580s and have never had any problems with microstutter unless the game itself is borked, e.g. Deus Ex.
 

sk3tch

Member
what cards did you have?

i have SLI 580s and have never had any problems with microstutter unless the game itself is borked, e.g. Deus Ex.

Some people are just more sensitive to it than others. I've got 580 SLI, currently. I get micro-stutter...albeit WAY better than my 6970 CFX.
 

Theonik

Member
Hopefully with PCIe 3.0 cards we should see less microstutter, no? I always thought it was more a bandwidth problem than anything.
 

Theonik

Member
It isn't a bandwidth issue at all, and we're not even using half the bandwidth of PCI-E 2.0. Marketing disinformation FTL.
Yeah, I realised my mistake. I remember reading something along those lines a long time ago. 8800 Ultra being the king, long time ago. lol
 
Based on the 'Kepler may launch early in Feb' rumor that Tom's posted up (m.tomshardware.com/news/Nvidia-Kepler-GTX680-GPU-geforce,14499.html), does anyone else see this being a decoy product to steal sales from the 7970/7950 while Nvidia preps a beefier version for a March/April launch? Nvidia has done this many times before when pressured to compete.
 

tokkun

Member
The rumors from SA and fudzilla both said April. Rumors are rumors of course, but I would put more stock in them than Chiphell.
 

CaLe

Member
Can't wait to fucking ditch ATI / AMD and their craptastic drivers in favor of Nvidia... Biggest mistake I made was going the ATI / AMD route, and I regret it with every game I play.
 
Time for a little update...

Check out this rumor with a leaked spec sheet.



http://www.chiphell.com/thread-338350-1-1.html

Some of it may be fake, but if those are real specs or close to it... wow. Kepler could be insane.

But I really don't see how GK104 could have only 186 GB/s of memory bandwidth and feed ~1536 CUDA cores, especially when the GTX 580 struggles even with a 384-bit bus and 512 CUDA cores... unless they're making the CUDA cores more diversified, like AMD's shader processors, and dropped the hot clocks. If so, that makes sense, but then Kepler CUDA cores may not be directly comparable to Fermi CUDA cores.
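The rumored bandwidth number can at least be sanity-checked, since peak bandwidth is just bus width times effective data rate. A quick sketch (the GTX 580 figures are its real 384-bit bus at ~4 GT/s; the 256-bit / 5.8 Gbps combo for GK104 is my own assumption that would make the rumored 186 GB/s work out):

```python
def bandwidth_gb_s(bus_bits: int, gbps_per_pin: float) -> float:
    """Peak memory bandwidth in GB/s: (bus width in bytes) x per-pin data rate."""
    return bus_bits / 8 * gbps_per_pin

# GTX 580: 384-bit bus, GDDR5 at ~4.008 Gbps effective
print(round(bandwidth_gb_s(384, 4.008), 1))  # ~192.4 GB/s

# Hypothetical GK104: a 256-bit bus at ~5.8 Gbps lands near the rumored figure
print(round(bandwidth_gb_s(256, 5.8), 1))    # ~185.6 GB/s
```

So 186 GB/s would be consistent with a 256-bit bus, which is narrower than the GTX 580's, and that's exactly why the core count in the rumor looks surprising.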

Another interesting rumor:

What you can find on Chinese forums nowadays is very fresh leaked news and photographs, likely from production lines. A user of the Diybbs.zol.com.cn forums has posted a picture of a Kepler GK-110 core, which is probably the GeForce GTX 680.

According to the picture, the upcoming flagship model of the GeForce 600 Series is codenamed GK-110-485-A2. The pictured graphics processing unit was made in the 50th week of 2011, which means it is very fresh. No other information was posted. NVIDIA is very careful about leaking news. AMD's specifications were known months before the release of the HD 7000 Series, while NVIDIA's GPUs are a mystery. All knowledge is based on predictions and presentations from NVIDIA itself. Everyone was expecting NVIDIA to release at least very basic info about their new models during CES 2012. Unfortunately, that did not happen.

The latest news suggests that the first Kepler-based graphics cards may hit stores in February or March. SemiAccurate has already confirmed that the upcoming GeForce series will be more powerful than the Radeon HD 7000, at least for the flagship models.



Man, I really hope Kepler is faster than the Radeon 79xx's so the 28nm GPU wars can go full throttle.
 

dLMN8R

Member
Hopefully these rumors of a significant advantage over AMD are true. I have no allegiance to either company, having happily switched between them in the past, but I'm about done with AMD's complete lack of developer outreach.

Call it "bribes" or call it "being proactive", I don't care how NVIDIA always manages to get first-tier support out of the latest games with AMD almost always having more issues, NVIDIA does what they need to do to almost always provide a better experience for customers with their cards.

From my 9600 Pro to my 4870 and now my 6850, it always seems like I could be getting a better overall experience out of a similarly-priced NVIDIA card based on everything I read. I don't expect any difference with this new series, and that's where my dollars will probably go.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Hmm obr hardware? Never heard of it, let me give it a read:
Do not buy slow Radeon with Tahiti chip now (for that prices), Kepler is completely decimate them and worth the wait, Radeons will be really cheap in April. As I said long ago, middle-class kepler GeForce card easily kills the high-end HD 7970. Really looking forward to April, mainly on the faces of people in AMD :)
Hmm?? Seems legit.
 
That stuttering in Deus Ex is an issue with the game.

Not drivers...the game

I have an i7 2600K, 8GB RAM, and a 560 Ti. I've lowered the settings to the point of being even below medium, and I still get the stutter. The game needs a serious patch.
 

kitch9

Banned
Hmm obr hardware? Never heard of it, let me give it a read:

Hmm?? Seems legit.

There's lots of bullshit and mudslinging, and lots of sites are just reposting each other's news, having in turn been fed bullshit.

Two weeks after CES, Nvidia starts leaking info on a card that destroys the competition at a price that's too good to be true, and we're supposed to believe it's all completely accurate?

I don't think so.
 
That stuttering in Deus Ex is an issue with the game.

Not drivers...the game

I have an i7 2600K, 8GB RAM, and a 560 Ti. I've lowered the settings to the point of being even below medium, and I still get the stutter. The game needs a serious patch.

It's that shitty engine they used for it. Wasn't made for FPS.
 

dark10x

Digital Foundry pixel pusher
That stuttering in Deus Ex is an issue with the game.

Not drivers...the game

I have an i7 2600K, 8GB RAM, and a 560 Ti. I've lowered the settings to the point of being even below medium, and I still get the stutter. The game needs a serious patch.
They already patched it once and it massively improved performance. I see virtually no stuttering of any kind with the game now. It's buttery smooth.

It was pretty bad when the game first launched, however.
 

artist

Banned
I see that I hurt your feelings in that other thread, but I'm genuinely curious as to why you think I'm an Nvidia fanboy.
I would like to know first as it was you who took the first shot.

So ~300mm^2 chip in cards for $300 is too good to be true? Then all the 4800s, 5800s, 460s, 560s selling for $300 are too good to be true, right?
I wonder who's the shill there...
It probably refers to the "AMD doomed" talk. GK104 may very well be priced at $300, but the notion that it could beat the 7970 handily, leaving AMD 7970 owners salty and future 7970 buyers holding off their purchases (much like what I'm saying here, uh oh!), is probably what's suspect.

The fact that all of these posts are coming from newly signed-up members on the same IP subnet is what's wrong.
 

WarMacheen

Member
Way faster. I don't know why people think these 7xxx cards are such beasts; when they compare one to a 3GB 580, the 7970 has a huge clock advantage. It should be compared against the 580 Classified Ultra with the 900 MHz core, which also recently had a price drop to $550.

I wish someone would bench the two and compare:
7970 vs 3GB 580 Classified Ultra
@ Stock
@ Max OC
and clock-for-clock @ 925MHz.


I don't think the 580 Classy would be all that much slower, even on an older architecture.



You really have no idea.
 

WarMacheen

Member
So ~300mm^2 chip in cards for $300 is too good to be true? Then all the 4800s, 5800s, 460s, 560s selling for $300 are too good to be true, right?
I wonder who's the shill there...



$299 for a card that can beat a 7970 is the part that was in question.
 

artist

Banned
You really have no idea.
Yeah, why would anyone drop clocks to compare architectures? The Fermi architecture already has its cores running at almost twice the speed of AMD's; why not drop them by half and compare?

You are correct, he really doesn't have a clue.
 

Hellish

Member
You really have no idea.

Enlighten me how this would be a bad comparison; it seems it would show the real difference between the cards. Both are $550 cards (580 Classy Ultra & reference 7970). It's not an architecture comparison I'm looking for, it's an in-game performance comparison of two $550 cards.

Yeah, why would anyone drop clocks to compare architectures? The Fermi architecture already has its cores running at almost twice the speed of AMD's; why not drop them by half and compare?

You are correct, he really doesn't have a clue.

Where did I say anyone is dropping clocks?

Both cards at stock? No.
Both cards at max OC? No.
Both cards at 925? Again, no.


...You are incorrect; you cannot read.
 
Pretty sure [H] compared an OCed 7970 to an OCed 580... pretty sure the 580 got dusted. Especially since 7970s can typically go upwards of 1100MHz and scale really well.

Your "clock for clock" comparison isn't really relevant.

Also, about Nvidia having double-clocked shaders: big whoop, because they have way fewer of them (512 in the 580 vs 2048 in the 7970). They need the double clock just to be competitive, and they still have a teraflops deficit.
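The teraflops gap is simple arithmetic: shader count x 2 ops per clock (FMA) x clock speed. A rough sketch using the commonly cited clocks (GTX 580 shader hot clock ~1544 MHz, 7970 core 925 MHz):

```python
def peak_tflops(shaders: int, clock_ghz: float) -> float:
    """Peak single-precision TFLOPS: shaders x 2 ops per clock (FMA) x clock."""
    return shaders * 2 * clock_ghz / 1000

print(round(peak_tflops(512, 1.544), 2))   # GTX 580: ~1.58 TFLOPS
print(round(peak_tflops(2048, 0.925), 2))  # HD 7970: ~3.79 TFLOPS
```

So even with shaders running at nearly double the clock, the 580's peak throughput is well under half the 7970's on paper; real-game gaps are smaller since peak FLOPS aren't the whole story.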
 

goodfella

Member
That stuttering in Deus Ex is an issue with the game.

Not drivers...the game

I have an i7 2600K, 8GB RAM, and a 560 Ti. I've lowered the settings to the point of being even below medium, and I still get the stutter. The game needs a serious patch.

Try windowed mode.

No need for thanks.
 

prophecy0

Member
Try windowed mode.

No need for thanks.

For some reason windowed mode didn't work for me. This is the only thing that fixed it:
http://www.neogaf.com/forum/showpost.php?p=34483305&postcount=19559



And now, to stay on topic, I hope Kepler is freaking awesome. I currently have a 570 that does a damn good job on most everything, but I want to play the newer games with maxed settings and maintain 60+ fps. At the very least I hope Kepler forces AMD to drop the price on the 7970.
 