Nvidia GTX 6xx/7xx Launch Thread of Swirling Kepler Rumors

TheExodu5

Banned
Well, to be fair, I have the microstutter problem on my GTX 460 for Deus Ex: HR. This is the first time I've ever experienced it. It really does make a 60fps game seem like it's running at a junk framerate and I'd quit PC gaming if this was the norm. That's some messed up incompetence whatever causes it.

That's not microstutter...that's just plain stutter that is caused by a bad engine. Turn on DX11 and the problem should go away. It only happens to me in DX9 in Deus Ex.
 

iNvid02

Member
I'm playing it now and DXHR is borked: random fps drops in DX11 or DX9, plus it doesn't utilise all of my GPU either.

Read their forums; they still haven't fixed it entirely. DX11 gives better performance, and SSAO on High is more stable than on Normal.
 

TheExodu5

Banned
I turn off SSAO entirely because of the terrible mouse lag it introduces.

Their triple buffering implementation is also poor since it's evidently queuing frames, causing a lot of mouse lag. I don't know why so few engines implement triple buffering properly...drives me nuts.

And before anyone tag-quotes me, D3DOverrider forces the input-lag-inducing implementation of triple buffering, so it doesn't help.
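
To illustrate what I mean, here's a toy latency model in Python (just a sketch; the 60Hz refresh and 8ms render time are assumptions, and no real engine or driver works exactly like this). With a FIFO render-ahead queue plus vsync, a GPU that outruns the refresh rate fills the queue and you end up looking at a frame whose input was sampled about three refreshes ago. With proper triple buffering, the newest finished frame wins at each vsync, so latency stays around one render time.

Code:
REFRESH = 1000 / 60.0   # ms per vsync at 60Hz (assumption)
RENDER = 8.0            # ms to render one frame, faster than the refresh (assumption)

def fifo_queue_latency(depth=3, vsyncs=60):
    # "Triple buffering" as a 3-deep FIFO flip queue: the renderer works ahead
    # until the queue is full, then blocks; each vsync displays the OLDEST frame.
    queue, t, latencies = [], 0.0, []
    for v in range(1, vsyncs + 1):
        t_vsync = v * REFRESH
        while len(queue) < depth and t + RENDER <= t_vsync:
            queue.append(t)        # input for this frame was sampled at render start
            t += RENDER
        if queue:
            latencies.append(t_vsync - queue.pop(0))
        t = max(t, t_vsync)        # a blocked renderer resumes when a slot frees
    return latencies[-1]

def latest_wins_latency(vsyncs=60):
    # Proper triple buffering: the renderer never blocks, and each vsync shows
    # the NEWEST completed frame; stale back buffers just get overwritten.
    t, latencies = 0.0, []
    for v in range(1, vsyncs + 1):
        t_vsync = v * REFRESH
        newest = None
        while t + RENDER <= t_vsync:
            newest = t
            t += RENDER
        if newest is not None:
            latencies.append(t_vsync - newest)
    return latencies[-1]

print(f"FIFO queue of 3  : ~{fifo_queue_latency():.0f} ms input-to-display")  # about 3 refreshes (~50 ms)
print(f"Latest frame wins: ~{latest_wins_latency():.0f} ms input-to-display")  # about one render (~8 ms)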
 

x3sphere

Member
Let me count the ways:
-Prototype: shadows still completely borked after all these years, never fixed...
-Skyrim doesn't scale at all
-Saints Row 3: super low fps
-Homefront: unplayable, performance degrades even on the lowest settings
-Rage (well, Rage had stutter problems on both platforms, nevertheless)
-Deus Ex: HR stutter
-The Witcher 2 drunk-mode slowdown was never fixed on the HD 48xx series! Only fixed on the newer cards (the usual fuck you from AMD for 'legacy' support, if you can call 3 years legacy)
-microstutter in a myriad of games, even on a single GPU.

There's more but these are the ones that come to mind.
AMD has always had a few more problems than Nvidia, but lately it completely takes the piss.

Then there's the fact that Nvidia has way more awesome supersampling AA options and way better SLI support.

My old Radeon 9800 Pro was released like 2 years before Vista was and it NEVER got Vista drivers, ever.
You always had to run some broken XP driver, and the 9200 cards don't work at all.

AMD is terrible with support; they forget their hardware exists within 2-3 years of release. You can count on that, in my experience.

Microstutter doesn't exist on a single GPU. Whatever you're seeing in a myriad of games is not microstutter. You're mistaking HDD lag, or games pushed above your VRAM capacity, for microstutter.

People expecting a 60 fps minimum in every game with all settings maxed are just setting themselves up for disappointment. I don't care if you've got dual flagship GPUs; there will be scenarios where they buckle and go under 60, and I certainly don't just mean due to a game's technical/visual prowess...

This. Buying the latest and greatest has never given a constant 60 FPS, doesn't matter what GPU generation. In many cases CPUs are still the limitation too when trying to hit 60 FPS minimum.

The next GPU from Nvidia isn't going to be an endgame card; even in SLI I have no doubt it'll dip below 60 FPS here and there in today's titles. Moreover, buying dual GPU has never been a good value. You always get more bang for your buck going single GPU, and it comes with none of the problems.
 

Smokey

Member
Agreed. Unless there is actually any real information, this thread should be locked. This is just a desperate grab at trying to land the OT before anyone else can, or a pathetic attempt to take attention away from the 7970.

Umm...BP101 has been touting and praising the 7970 more than anybody.
 
This. Buying the latest and greatest has never given a constant 60 FPS, doesn't matter what GPU generation. In many cases CPUs are still the limitation too when trying to hit 60 FPS minimum.

The next GPU from Nvidia isn't going to be an endgame card; even in SLI I have no doubt it'll dip below 60 FPS here and there in today's titles. Moreover, buying dual GPU has never been a good value. You always get more bang for your buck going single GPU, and it comes with none of the problems.

And this will ALWAYS be the case. Developers will keep pushing hardware, and there will always be settings or scenarios in a game that will bring your fps under 60 (no matter how powerful your setup is). This is a never-ending cycle.
 

pestul

Member
I mostly prefer AMD cards, but I hope Kepler comes out soon for good competition. These prices are just too damn high. With AMD launching the 7950 later this month, Nvidia better get their act together... end of March still seems too far away. God forbid they slip to June or later. :S
 

SoulClap

Member
I hope you'll be able to run three monitors with a single card this time. Ability to bitstream HD audio would be nice as well.
 
Umm...BP101 has been touting and praising the 7970 more than anybody.

Yup, true that. And Kepler NEEDS to be a beastly generation of cards so we can have ourselves a 28nm price war, because it's best when competition is so fierce that consumers have to make a hard choice between great products. 7970/7950 might be beast cards, but if Kepler comes in 2-3 months and is superior, I might just have to sell whatever I've got and grab one.

I personally hope (but don't fully expect) Kepler really is 20-40% faster than the 79xx series and also OCs like a beast, so AMD goes 'oh shit' and has to do huge price drops.
 

Gav47

Member
What's the deal with the series name? Are they breaking up the mobile and desktop lines like they did for the GTX 3xx/4xx series?
 
Their triple buffering implementation is also poor since it's evidently queuing frames, causing a lot of mouse lag. I don't know why so few engines implement triple buffering properly...drives me nuts.

The consoles have neither the VRAM nor the grunt to do triple-buffering, and most game engines are made with consoles in mind now. It should surprise nobody that triple-buffering is essentially a lost cause at this point.

Fortunately most console ports run locked at 60fps anyways on my machine, so I just turn Vsync on.
 

ACH1LL3US

Member
BF3, Batman Arkham Asylum DX11 (PhysX OFF), Saints Row: The Third DX11, Crysis 2 with the DX11 patch and high-res textures. Those are just off the top of my head. 2-way SLI GTX 580s are a joke for how much money they cost me. I got the first one in December 2010 and the second in March 2011. I paid $580 and $550 respectively for two 1.5GB EVGA cards, which I keep OC'd to 820MHz. It has barely been one year and the setup is struggling with the latest games worse than any other multi-GPU setup I've ever had. I am upgrading as soon as Nvidia releases their successor.



REALLY??!!

I just built my rig, one 3GB EVGA 580 and a 980X Extreme Edition, all stock, and every game I've played so far at 1080p (BF:BC2, NFS:HP) averaged over 80 fps, and that's with everything maxed and 4x MSAA and 4x SSAA applied in the Nvidia control panel. I have yet to play Batman and BF3, but you having TWO GTX 580s and saying the performance is horrible makes NO sense. This contradicts everything I have read about a setup like yours.

What CPU are you running? What resolution?
 

ACH1LL3US

Member
It means I cannot get a constant 60fps. Also, I am very nitpicky when it comes to my PC gaming. On consoles I am willing to put up with graphical issues, but when I dump over $1k on video cards, I expect to make no compromises, and the sad truth is that I have to. Also, I find it unbelievable that you are experiencing a constant 60fps in those games. Maybe you are just oblivious to fps drops. In Saints Row: The Third, which is the least graphically intensive game of the ones I mentioned, I can easily experience fps drops when I hit a gang and cars and explosions start piling up. BF3 stays at 40-50fps, with drops even down to 30fps when I play on a 32v32 server. I had to dial back to High for this game.

Again, what resolution are you running, and what settings in BF3? Need to know the CPU too; it sounds like something isn't right here :/
 

ACH1LL3US

Member
Yeah, there is no way that BF3 is GPU limited with two 580s at 1080p.

The only thing I can think of is that if he is running it on Ultra, BF3 can use more than 1.5GB of VRAM even at 1080p, so if he doesn't have the 3GB variant of the 580 he would have the game grabbing memory off the HDD when it exceeds 1.5GB, causing stutter, frame rate issues, etc.

I can't fathom BF3 at 1080p bringing an SLI 580 setup to its knees :/
 

Smokey

Member
Yup, true that. And Kepler NEEDS to be a beastly generation of cards so we can have ourselves a 28nm price war, because it's best when competition is so fierce that consumers have to make a hard choice between great products. 7970/7950 might be beast cards, but if Kepler comes in 2-3 months and is superior, I might just have to sell whatever I've got and grab one.

I personally hope (but don't fully expect) Kepler really is 20-40% faster than the 79xx series and also OCs like a beast, so AMD goes 'oh shit' and has to do huge price drops.


Realistically I shouldn't have to upgrade my setup for years, but I'm trying so hard not to fall into this trap with the 7970s. Been doing good so far :p


The only thing I can think of is that if he is running it on Ultra, BF3 can use more than 1.5GB of VRAM even at 1080p, so if he doesn't have the 3GB variant of the 580 he would have the game grabbing memory off the HDD when it exceeds 1.5GB, causing stutter, frame rate issues, etc.

I can't fathom BF3 at 1080p bringing an SLI 580 setup to its knees :/

It doesn't. That's why I'm wondering if he's playing above 1080p. The only map in BF3 that gives me "trouble" is Oman, and even then my FPS is in the 70ish range (everything Ultra).
 

ACH1LL3US

Member
Realistically I shouldn't have to upgrade my setup for years, but I'm trying so hard not to fall into this trap with the 7970s. Been doing good so far :p




It doesn't. That's why I'm wondering if he's playing above 1080p. The only map in BF3 that gives me "trouble" is Oman, and even then my FPS is in the 70ish range (everything Ultra).

Yea, I know these new cards from AMD and Nvidia will be better than my 580, BUT I just haven't come across a need to upgrade; I'm already thinking of holding off on the second 580 as I am that satisfied with the performance. Keep in mind this is my first run with PC gaming since 2003 (been console gaming since then), so ANYTHING is a huge upgrade to me so far.
 

dark10x

Digital Foundry pixel pusher
The consoles have neither the VRAM nor the grunt to do triple-buffering, and most game engines are made with consoles in mind now. It should surprise nobody that triple-buffering is essentially a lost cause at this point.

Fortunately most console ports run locked at 60fps anyways on my machine, so I just turn Vsync on.
You do realize that triple buffering is extremely common on PS3, right? It's been used in games for years, while it never (or rarely) shows up on Xbox 360.
 
“Kepler” Features 256-bit and 2GB Memory

http://en.expreview.com/2012/01/14/kepler-features-256-bit-and-2gb-memory/20327.html

This is for the GK104 "Kepler" GPU.

Though we are unable to confirm the concrete specifications and how effective the performance of the first-launched "Kepler" will be, a well-informed source indicated that "Kepler" will feature a 256-bit memory controller, and its corresponding graphics cards will pack 2GB of memory and have a TDP of 225W. We infer the first product may be the GK104 "Kepler" GPU. Judging from the memory controller and memory capacity, it is supposed to be difficult to defeat the AMD Radeon HD 7970 or even the Radeon HD 7950; in addition, a TDP of 225W doesn't have much going for it compared with the competitor.
 
Way faster. I don't know why people are thinking these 7xxx cards are such beasts; when they compare one to a 3GB 580, the 7970 has a huge clock advantage. It should be compared against the 580 Classified Ultra with 900MHz on the core, which also recently had a price drop to $550.

I wish someone would bench the two and compare:
7970 vs 3GB 580 Classified Ultra
@ Stock
@ Max OC
and Clock for Clock @ 925MHz.


I don't think the 580 Classy would be all that much slower, even on an older architecture

Different archs. Current Nvidia GPUs run their shaders at twice the core clock, while AMD GPUs run theirs at the same speed as the core.
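
Rough back-of-the-envelope at reference clocks, just to show why clock-for-clock comparisons across the two archs don't tell you much (the 2 FLOPs per ALU per clock is the usual fused multiply-add assumption):

Code:
# GTX 580: 512 ALUs on a 1544MHz "hot clock" (2x the 772MHz core clock).
# HD 7970: 2048 ALUs running at the 925MHz core clock.
gtx580_gflops = 512 * 1544 * 2 / 1000    # ~1581 GFLOPS peak FP32
hd7970_gflops = 2048 * 925 * 2 / 1000    # ~3789 GFLOPS peak FP32
print(gtx580_gflops, hd7970_gflops)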

You don't want that; the 580 is obsolete vs the 7970, especially for OC. Max stable/24-7 OC with the reference PCB on the HD 7970 is ~1300-1350MHz, and 1200-1250MHz with stock vcore.

The 7970 is in a league of its own.

Did I mention that even at 1200MHz the HD 7970 consumes just 15W more? AMD has no competition right now, and they'll wait maaaaaaaaaaaany more months.
 

K.Jack

Knowledge is power, guard it well
Maybe. Wouldn't be a bad idea if they did.

The 300M series was completely stupid though, nearly disgraceful really. It hung Nvidia's mobile fans out to dry for nearly an entire year, stuck dicking around with 128-core G92 200M cards as the fastest things available, while the highest 300M card was a pitiful 96-shader, 128-bit GDDR5 POS which no one cared about.

If they pull some shit like that with the 600M, meaning no high-end parts this year, I'm completely done with the green team.
 
The 300M series was completely stupid though, nearly disgraceful really. It hung Nvidia's mobile fans out to dry for nearly an entire year, stuck dicking around with 128-core G92 200M cards as the fastest things available, while the highest 300M card was a pitiful 96-shader, 128-bit GDDR5 POS which no one cared about.

Yup, 3xx series was a joke. But Nvidia has been fucking over the mobile space for a long time now. Almost everything has pretty much been a refresh of the 88xx mobile cards, other than the new 4xx/5xx high-end mobile cards.
 
Are Nvidia cards generally noisier? I got a Vapor-X 5870 specifically to keep things quieter and I've been quite happy with it. Wondering what options on the Nvidia side there are for keeping the dB down.
I'm not that desperate to upgrade now, perhaps later in the year. Is DX11.1 something that cards will have to specifically support? Might it be best to wait for that too?
 

1-D_FTW

Member
Are Nvidia cards generally noisier? I got a Vapor-X 5870 specifically to keep things quieter and I've been quite happy with it. Wondering what options on the Nvidia side there are for keeping the dB down.
I'm not that desperate to upgrade now, perhaps later in the year. Is DX11.1 something that cards will have to specifically support? Might it be best to wait for that too?

It's all dependent on the fans. Buy a card with a good heat sink and nice fan and it'll be nice and quiet. Buy something with a cheap heat sink and small, whirly bird fan and it'll be loud.
 

iNvid02

Member
Nvidia GeForce GTX 680 May Launch Ahead of Schedule

The world of technology is really a series of chess matches between various rivals, each one making moves based on a playing board created by the other, all in an attempt to gain an edge and, if possible, declare checkmate (without running afoul of antitrust laws, of course). Two of the bigger participants are AMD and Nvidia, and to counter AMD's recent Radeon HD 7000 series launch, Nvidia may opt to release its upcoming GeForce GTX 680 graphics card a month early.

Chinese website ChipHell.com is reporting that Nvidia is pulling back the GTX 680's release from March or April and will launch the card sometime in February. It's nothing more than a rumor at this point, though ChipHell forum member Napoleon has a pretty strong track record when it comes to leaked information, VR-Zone says.

That said, the GTX 680 is supposed to offer similar performance to AMD's Radeon HD 7970, currently the fastest single-GPU videocard on the planet. The card's clock speed is rumored to be 780MHz, and it is said to have 2GB of memory. It will be the first card based on Nvidia's 28nm "Kepler" architecture.
 

sk3tch

Member
I don't even understand what Nvidia is thinking if those rumors are true. We've seen them do the VRAM "numbers game" before (3GB GT 555Ms on the Alienware M14x, anyone? Pointless... esp. with its max 900p screen lol...), so why not toss out 3GB to match AMD, if nothing else? Seems like that'd be easy to change, whereas all of their other specs seem due to yield/architectural problems...
 

Hazaro

relies on auto-aim
I don't even understand what Nvidia is thinking if those rumors are true. We've seen them do the VRAM "numbers game" before (3GB GT 555Ms on the Alienware M14x, anyone? Pointless... esp. with its max 900p screen lol...), so why not toss out 3GB to match AMD, if nothing else? Seems like that'd be easy to change, whereas all of their other specs seem due to yield/architectural problems...
Because 2GB is enough in most cases and it cuts costs. They don't feel pressured.
 
You guys realize why it'd be 2 GB, right? If the memory bus isn't an odd width like the 384-bit bus in the current GTX 580, and is rather a 256-bit or 512-bit bus, they're going to match the memory banks to that. It's the same reason the 7970 has 3 GB of RAM and the 580 has 1.5 GB or 3 GB.

So if the 680/780 has a 512-bit memory bus... wow. With a 512-bit bus and 6+ GHz GDDR5 RAM, that thing would have insane bandwidth for very high resolution gaming. And if true, they may release a 4 GB version later on.
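
The arithmetic behind that, as a rough Python sketch (chip densities are just the common ones right now, not confirmed GK104 specs):

Code:
# GDDR5 chips have 32-bit interfaces, so chip count = bus width / 32, and
# capacity = chips x chip density (2Gbit = 256MB chips are typical right now).
def memory_config(bus_bits, gbit_per_chip=2):
    chips = bus_bits // 32
    return chips, chips * gbit_per_chip / 8.0    # (chip count, capacity in GB)

def bandwidth_gbs(bus_bits, effective_gbps):
    return bus_bits / 8.0 * effective_gbps       # GB/s

print(memory_config(256))          # (8, 2.0)  -> 2GB, or 4GB with double-density chips
print(memory_config(384))          # (12, 3.0) -> 3GB, or 1.5GB with 1Gbit chips (GTX 580)
print(bandwidth_gbs(384, 5.5))     # 264 GB/s  -> the HD 7970
print(bandwidth_gbs(256, 6.0))     # 192 GB/s  -> a 256-bit part at 6GHz effective
print(bandwidth_gbs(512, 6.0))     # 384 GB/s  -> the hypothetical 512-bit card above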
 

x3sphere

Member
Does any game use more than 2GB of data yet?

Would be funny if Nvidia released their card and it matched or exceeded the 7970, and was cheaper.

At 1600p, I go over 2GB in a lot of games.

Kind of silly to launch with only a 2GB part if that is the case; most people buying these cards will be gaming on multiple monitors or above 1080p. Hopefully a 4GB variant is also available.
 
What's so fucking stellar about the 7970 at €480 ($610)? I don't get it, I paid €280 for my 4870 back when it released. Does AMD see a world where people are still making money hand over fist?

The 580s aren't dropping in price over here either. I need to build a new computer, but I really don't know what GPU I should go for.
 

Hazaro

relies on auto-aim
What's so fucking stellar about the 7970 at €480 ($610)? I don't get it, I paid €280 for my 4870 back when it released. Does AMD see a world where people are still making money hand over fist?
Slot in the correct price bracket, receive money.
 

pestul

Member
What's so fucking stellar about the 7970 at €480 ($610)? I don't get it, I paid €280 for my 4870 back when it released. Does AMD see a world where people are still making money hand over fist?

The 580s aren't dropping in price over here either. I need to build a new computer, but I really don't know what GPU I should go for.

They are priced for the performance they offer versus the nearest competitor. Yeah, it's not fair, but they went that way this time. Perhaps when the entire line is out later this month/February things will change.
 
I really, really hope that the next couple of generations of ultrabooks can get better gaming performance, whether through a dedicated card or an onboard solution. PC gaming has been hurting quite a bit because it used to be that when you bought a new computer it was guaranteed to run all of the latest games. Now there are computers being sold that can still barely run a 2007 game.

I know this thread is for the desktop stuff, but it's not nearly as interesting these days, and it won't be until the next generation of consoles rolls out and games are being developed for bleeding-edge hardware again.
 

Hazaro

relies on auto-aim
I really, really hope that the next couple of generations of ultrabooks can get better gaming performance, whether through a dedicated card or an onboard solution. PC gaming has been hurting quite a bit because it used to be that when you bought a new computer it was guaranteed to run all of the latest games. Now there are computers being sold that can still barely run a 2007 game.

I know this thread is for the desktop stuff, but it's not nearly as interesting these days, and it won't be until the next generation of consoles rolls out and games are being developed for bleeding-edge hardware again.
It should get a lot better since there has been a switch to prioritize performance per watt.
 

pestul

Member
I really, really hope that the next couple of generations of ultrabooks can get better gaming performance, whether through a dedicated card or an onboard solution. PC gaming has been hurting quite a bit because it used to be that when you bought a new computer it was guaranteed to run all of the latest games. Now there are computers being sold that can still barely run a 2007 game.

I know this thread is for the desktop stuff, but it's not nearly as interesting these days, and it won't be until the next generation of consoles rolls out and games are being developed for bleeding-edge hardware again.

Dude, it's been like that for a long time. I mean, since like discrete 3D cards came out. I'd love it to be different, but there are simply office/productivity PCs and gaming PCs.
 

tokkun

Member
What's so fucking stellar about the 7970 at €480 ($610)? I don't get it, I paid €280 for my 4870 back when it released. Does AMD see a world where people are still making money hand over fist?

2 things:

1. Lack of competition at that specific performance point.
2. Unlike previous generations, the prices of the old cards never really dropped, so the last generation cards aren't the usual better value.
 

x3sphere

Member
If these cards are specced for PCI-E 3.0, don't you need Ivy Bridge to unlock them?

What do you mean by unlock?

3.0 is backwards compatible with 2.0. The HD 7970 is PCI-E 3.0, and benchmarks showed no difference between the two; we're still a ways out from saturating PCI-E 2.0 x16.
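
For reference, the rough per-direction bandwidth of an x16 slot (the encoding overheads are the spec's 8b/10b and 128b/130b):

Code:
lanes = 16
pcie2_gbs = lanes * 5.0 * (8 / 10.0) / 8      # 5 GT/s with 8b/10b encoding    -> 8.0 GB/s
pcie3_gbs = lanes * 8.0 * (128 / 130.0) / 8   # 8 GT/s with 128b/130b encoding -> ~15.75 GB/s
print(pcie2_gbs, pcie3_gbs)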
 