
Nvidia past generation GPUs aging terribly

Skyzard

Banned
Kepler as an architecture came out in 2012. The 780Ti is just a different version of the Titan, a GK110, which came out early in 2013. It's not unreasonable that Nvidia would have realized most of Kepler and GK110's potential by now.

Do they still make proper use of their potential, one year after releasing a card in that range?

Nope, gots maxwell to sell.

Then next year it's pascal.

More people need to start seeing graphics card purchases as hardware and driver support. Hardware alone doesn't get you what you'd expect after a little while, once they drop proper support. One year. And they're charging £500 for the cards. And hoping people SLI.

That's a good business model. It's fucking awful for consumers who get the short end of the stick and pay for performance increases they could already have had from their current hardware with decent support.
 

Seanspeed

Banned
Do they still make proper use of their potential, one year after releasing a card in that range?

Nope, gots maxwell to sell.

Then next year it's pascal.

More people need to start seeing graphics card purchases as hardware and driver support. Hardware alone doesn't get you what you'd expect after a while.
Not sure you even read what I wrote...?
 

Wag

Member
My 3 OG Titans perform like shit in the Witcher 3 @ 4k/60Hz, even with Hairworks off and some of the features turned off. I definitely need a new card (or 2).
 

Skyzard

Banned
Sure, the first Kepler card came out earlier than the 780Ti...that was expressed in my first sentence. Are realizing their max potential and having driver support to actually make use of it in new games the same thing?

3 OG Titans, bruh. Turned to sludge. Much disrespect.
 

Chesskid1

Banned
i feel sorry for those who bought a g-sync monitor

they pretty much locked themselves into buying nvidia cards for forever when PC gaming is supposed to be about options. what if the next AMD gpu smashes nvidia?

and now nvidia cards are degrading in performance, so people want to upgrade quicker

nvidia figured out how to get all your money, damn. kudos to them.
 

lmimmfn

Member
This is pretty old news: they released their new driver with Kepler fixes the night before the 980Ti was released, so benchmarks would look much better between 780Ti/780 -> 980Ti (and a Google search for months will show those initial benchmarks).

I get a ~10% performance increase with the "fixed" Kepler driver, but it keeps crashing, so I'm back on the older stable ones. They seem to be looking into it according to ManuelG - http://forums.guru3d.com/showthread.php?t=399780
 

Narroo

Member
Do they still make proper use of their potential, one year after releasing a card in that range?

Nope, gots maxwell to sell.

Then next year it's pascal.

More people need to start seeing graphics card purchases as hardware and driver support. Hardware alone doesn't get you what you'd expect after a little while, once they drop proper support. One year. And they're charging £500 for the cards.

That's a good business model. It's fucking awful for consumers who get the short end of the stick and pay for performance increases they could already have had from their current hardware with decent support.
Time to start moving back to consoles, everyone.
 

Seanspeed

Banned
Sure, the original Kepler came out earlier than the 780Ti...that was in my post, in my way :p
And the point is that there may simply not be any more useful improvements to pull from Kepler and GK110.

Kepler is three years old and GK110 is more than two years old.
 

Durante

Member
Kepler as an architecture came out in 2012. The 780Ti is just a different version of the Titan, a GK110, which came out early in 2013. It's not unreasonable that Nvidia would have realized most of Kepler and GK110's potential by now.
Computerbase actually did an in-depth 2-year study of driver performance with many games towards the end of 2014. It covers driver versions starting from January 2013 until that point in time, and was performed on Kepler for NV obviously.

These are the results:
[driver performance chart]


As you can see, the total improvement over this timespan on Kepler was almost 20%. What you also see is that this improvement started high and then flattened out more and more.

Whenever a new architecture comes out, it opens up many new optimization opportunities in software. The fact that Maxwell, which is far newer and thus less optimized, now gets larger improvements in new driver versions is perfectly normal, and decidedly not some sign of a huge conspiracy.

i feel sorry for those who bought a g-sync monitor
Those poor people using the best monitor technology for gaming currently available. Yeah, we really should feel for them.
 

Nikodemos

Member
nvidia figured out how to get all your money, damn. kudos to them.
More like nVidia figured out relatively early on that the PC hardware market is slowly becoming more and more otaku-ified, as lower-spec desktops dwindle in favour of enthusiast builds. Just like mobile game devs, they realised there's good money to be made in whale farming.
 

lmimmfn

Member
Whenever a new architecture comes out, it opens up many new optimization opportunities in software. The fact that Maxwell, which is far newer and thus less optimized, now gets larger improvements in new driver versions is perfectly normal, and decidedly not some sign of a huge conspiracy.
The problem is when they stop optimizing older hardware for new games. On average, Kepler performance is OK, but benchmarks of GTA V and Witcher 3 show a 780Ti as being quite a bit slower than a 970.
 

kinggroin

Banned
GTX 780 6GB? I have the same card and I'm unconvinced that it couldn't be doing better in The Witcher 3. Don't get me wrong, Witcher 3 looks great, but not that much better than the other games I have, which don't drop below 60fps.
I dunno, maybe I'm wrong but I suspect Nvidia could get more performance out of these cards if they had any incentive to do so.

If this were a closed platform you might be right. Otherwise, that isn't an issue exclusive to Kepler GPUs, but to PC gaming in general.
 

Lulubop

Member
More like nVidia figured out relatively early on that the PC hardware market is slowly becoming more and more otaku-ified, as lower-spec desktops dwindle in favour of enthusiast builds. Just like mobile game devs, they realised there's good money to be made in whale farming.

The fuck are some of you going on about?
 

Durante

Member
More like nVidia figured out relatively early on that the PC hardware market is slowly becoming more and more otaku-ified, as lower-spec desktops dwindle in favour of enthusiast builds. Just like mobile game devs, they realised there's good money to be made in whale farming.
Yeah, shame on Nvidia for kicking the monitor market out of basically a decade of stagnancy. I really hate the fact that suddenly it's apparently possible to create fast-responding IPS panels and get rid of display driving concepts rooted in the 90s which make no sense now.

And also shame on them for actually catering to enthusiast gamers.

You know what this is? Envy.

The fuck are some of you going on about?
See above.
 

Seanspeed

Banned
Computerbase actually did an in-depth 2-year study of driver performance with many games towards the end of 2014. It covers driver versions starting from January 2013 until that point in time, and was performed on Kepler for NV obviously.

These are the results:
[driver performance chart]


As you can see, the total improvement over this timespan on Kepler was almost 20%. What you also see is that this improvement started high and then flattened out more and more.

Whenever a new architecture comes out, it opens up many new optimization opportunities in software. The fact that Maxwell, which is far newer and thus less optimized, now gets larger improvements in new driver versions is perfectly normal, and decidedly not some sign of a huge conspiracy.
Interesting stuff.
 

wachie

Member
Nvidia was catering to enthusiast gamers in the pre-Titan era too, just not with those insane pricing tiers. That's probably what Nikodemos meant, but okay, everything is about G-Sync!
 

Nikodemos

Member
Yeah, shame on Nvidia for kicking the monitor market out of basically a decade of stagnancy. I really hate the fact that suddenly it's apparently possible to create fast-responding IPS panels and get rid of display driving concepts rooted in the 90s which make no sense now.
VR makes the likes of GSync/FreeSync irrelevant, since variable refresh is useless in visors. You need a locked native 60/120 (Morpheus) or 90 (IIRC for Rift/Vive) fps output from your card or else you'll hurl.

They're interesting/nice stopgaps until displays become obsolescent due to VR-compatible GUIs, to be sure.

Nvidia was catering to enthusiast gamers pre-Titan era also, just not with those insane pricing tiers. What Nikodemos meant was probably this but okay everything is about G-Sync!
Pretty much. Don't know where GSync came into the discussion, since I didn't mention it.
 

dr_rus

Member
Computerbase actually did an in-depth 2-year study of driver performance with many games towards the end of 2014. It covers driver versions starting from January 2013 until that point in time, and was performed on Kepler for NV obviously.

These are the results:
[driver performance chart]


As you can see, the total improvement over this timespan on Kepler was almost 20%. What you also see is that this improvement started high and then flattened out more and more.

Whenever a new architecture comes out, it opens up many new optimization opportunities in software. The fact that Maxwell, which is far newer and thus less optimized, now gets larger improvements in new driver versions is perfectly normal, and decidedly not some sign of a huge conspiracy.

Those poor people using the best monitor technology for gaming currently available. Yeah, we really should feel for them.

Shhhhh. Terribly!
 

Skyzard

Banned
Computerbase actually did an in-depth 2-year study of driver performance with many games towards the end of 2014. It covers driver versions starting from January 2013 until that point in time, and was performed on Kepler for NV obviously.

These are the results:
[driver performance chart]


As you can see, the total improvement over this timespan on Kepler was almost 20%. What you also see is that this improvement started high and then flattened out more and more.

Whenever a new architecture comes out, it opens up many new optimization opportunities in software. The fact that Maxwell, which is far newer and thus less optimized, now gets larger improvements in new driver versions is perfectly normal, and decidedly not some sign of a huge conspiracy.

Those poor people using the best monitor technology for gaming currently available. Yeah, we really should feel for them.

That shows the same for AMD though, doesn't it?

Yet we have people saying that the reason AMD cards which were previously on par with Nvidia cards are now beating them in new games is advancements in drivers (AMD being slower at them), and not Nvidia underutilizing its cards.
 

SparkTR

Member
VR makes the likes of GSync/FreeSync irrelevant, since variable refresh is useless in visors. You need a locked native 60/120 (Morpheus) or 90 (IIRC for Rift/Vive) fps output from your card or else you'll hurl.

They're interesting/nice stopgaps until displays become obsolescent due to VR-compatible GUIs, to be sure.

VR isn't going to be the be-all and end-all; it's going to offer different experiences than what you can get with traditional set-ups, not necessarily better ones. Palmer Luckey talked about the drawbacks of VR in genres such as arena shooters (UT, TF2), for example.
 

Elman

Member
i feel sorry for those who bought a g-sync monitor

What a shame. Those were good people...what a rotten thing to buy.


If any of these poor souls need to free themselves from the shackles of G-SYNC and NVIDIA, I will happily take their G-SYNC monitor...because it's the right thing to do.
 

Durante

Member
Pretty much. Don't know where GSync came into the discussion, since I didn't mention it.
Perhaps you should have read the post you replied to? Threads of conversation and all that.

That shows the same for AMD though doesn't it?
It shows an 11.5% improvement for AMD and an 18.4% improvement for Kepler.

The larger point I was making is that improvements surge with the release of new hardware and then stagnate. For everyone. Maxwell surged (like everything) after its release, so it's now further ahead of Kepler than it was at release.
 

Skyzard

Banned
Perhaps you should have read the post you replied to? Threads of conversation and all that.

It shows an 11.5% improvement for AMD and an 18.4% improvement for Kepler.

The larger point I was making is that improvements surge with the release of new hardware and then stagnate. For everyone. Maxwell surged (like everything) after its release, so it's now further ahead of Kepler than it was at release.

That seems even worse, though. AMD and Nvidia both flatten out, yet somehow AMD cards pull ahead of Nvidia cards they used to be on par with or behind, even after driver advancements stagnated...
 
My GTX 780 has been underperforming in every recent big game ever since I got it. I don't even know why I paid $530 for a "high-end" card. I will be extremely wary of which GPU I get next, and I will switch very soon.

I hate Nvidia and I will try not to deal with them ever again. They have terrible customer service, terrible product support and silly GPU prices. I hope that someone sues them soon for their shady driver support (driver "bugs" that affected Kepler GPUs lol) and GameWorks; they are a monopoly in the first place.
 
I hate Nvidia and I will try not to deal with them ever again. They have terrible customer service, terrible product support and silly GPU prices. I hope that someone sues them soon for their shady driver support and GameWorks; they are a monopoly in the first place.

Slightly OT, but did you buy the card directly from NV?
 

lmimmfn

Member
980 performance on release vs Kepler/AMD performance:
[benchmark chart]


980 performance in Witcher 3 vs Kepler & AMD:
[benchmark chart]


Understood that performance would improve on the newer cards, but a 970 jumping from -10% vs a 780Ti in Crysis 3 to +20% in Witcher 3 is absolutely ridiculous.

The latest 353.06 Kepler fix drivers improve performance by ~10% in Witcher 3, but still.
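
Just to make the size of that swing concrete, here's a quick sketch with made-up frame rates (not the actual benchmark numbers) of what going from -10% to +20% relative to a 780Ti means in practice:

```python
# Hypothetical fps figures, purely illustrative -- not taken from the benchmarks above.
def relative_delta(card_fps, reference_fps):
    """Performance of card_fps relative to reference_fps, in percent."""
    return (card_fps / reference_fps - 1.0) * 100.0

# Crysis 3 (made up): 780Ti = 60 fps, 970 = 54 fps -> the 970 sits ~10% behind
print(relative_delta(54, 60))   # -10.0
# Witcher 3 (made up): 780Ti = 40 fps, 970 = 48 fps -> the 970 sits ~20% ahead
print(relative_delta(48, 40))   # 20.0
```

Same two cards, yet the relative gap swings by roughly 30 percentage points between titles.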
 

Durante

Member
That seems even worse, though. AMD and Nvidia both flatten out, yet somehow AMD cards pull ahead of Nvidia cards they used to be on par with or behind, even after driver advancements stagnated...
I don't really understand your argument. NV reached larger improvements more quickly at the driver level; how is that worse?

Understood that performance would improve on the newer cards, but a 970 jumping from -10% vs a 780Ti in Crysis 3 to +20% in Witcher 3 is absolutely ridiculous.

The latest 353.06 Kepler fix drivers improve performance by ~10% in Witcher 3, but still.
I assume "Ultra" means that hairworks was on in those benchmarks? If so, that simply shows that Maxwell is better at tessellation.
 

Skyzard

Banned
^ Worse in terms of the Kepler conspiracy, not the performance improvements and capability within the first year of release.

Kepler even gets a significant fps bump in the game being talked about here (TW3) by switching PhysX to CPU instead of auto-select.



Always buy from Amazon if you can bear the wait. Yes, they can be slow getting the parts, but that customer service is unrivaled and worth the wait really, especially for expensive purchases like this. Do Nvidia even sell cards directly? Or do you mean only Nvidia parts and not the likes of MSI or EVGA...because Nvidia don't make good enough fans to keep the clocks without giving your PC some levitation.
 

Cerity

Member
I thought this was generally a known thing if you've kept an eye on driver releases? I remember AMD doing the same thing during their 4xxx, 5xxx and 6xxx series.
 

Nikodemos

Member
I assume "Ultra" means that hairworks was on in those benchmarks? If so, that simply shows that Maxwell is better at tessellation.
I'm pretty certain that is the case. Simplest way to check is if the 285 is above or below the 280X, framerate-wise. In this test, the 285 is a full 5 frames above the 280X, despite the latter's wider memory interface and larger pool.
 

napata

Member
I assume "Ultra" means that hairworks was on in those benchmarks? If so, that simply shows that Maxwell is better at tessellation.

I'm pretty certain that is the case. Simplest way to check is if the 285 is above or below the 280X, framerate-wise. In this test, the 285 is a full 5 frames above the 280X, despite the latter's wider memory interface and larger pool.

Hairworks is off, and I thought The Witcher 3 hardly used tessellation?
 

Skyzard

Banned
I thought this was generally a known thing if you've kept an eye on driver releases? I remember AMD doing the same thing during their 4xxx, 5xxx and 6xxx series.

I remember they used to have more and more "CrossFire disabled" *fixes* when the new series landed. And that I just stopped updating drivers after a year and a bit...

I'm pretty certain that is the case. Simplest way to check is if the 285 is above or below the 280X, framerate-wise. In this test, the 285 is a full 5 frames above the 280X, despite the latter's wider memory interface and larger pool.

TW3 changes what ULTRA means depending on the card, AFAIK (full/half/no Hairworks, etc.).
 

TSM

Member
I don't understand the OP's premise. Why would anyone compare these cards against each other without using actual frame rate numbers? Even then there are things like frame times and other factors. There are plenty of benchmarks out there with actual numbers, as opposed to relatively comparing cards that are themselves being relatively compared to other cards.
 

The Llama

Member
I don't understand the OP's premise. Why would anyone compare these cards against each other without using actual frame rate numbers? Even then there are things like frame times and other factors. There are plenty of benchmarks out there with actual numbers, as opposed to relatively comparing cards that are themselves being relatively compared to other cards.

The idea of using percentages is to be able to compare several games at once. Comparing framerates wouldn't be a good idea because of how different the demands can be from game to game.
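
For what it's worth, here's a minimal sketch of that kind of normalization (the card names and fps values are made up purely to show the idea, not real benchmark data):

```python
# Made-up fps numbers, purely illustrative of the method.
results = {
    "Game A": {"780Ti": 62.0, "970": 58.0, "290X": 60.0},
    "Game B": {"780Ti": 41.0, "970": 47.0, "290X": 46.0},
}

def average_relative_performance(results, reference="780Ti"):
    """Average each card's per-game fps relative to the reference card (1.0 = parity)."""
    cards = {card for per_game in results.values() for card in per_game}
    return {
        card: sum(fps[card] / fps[reference] for fps in results.values()) / len(results)
        for card in cards
    }

print(average_relative_performance(results))
# e.g. {'780Ti': 1.0, '970': ~1.04, '290X': ~1.04}
```

Averaging raw fps instead would let whichever game runs at the highest frame rate dominate the comparison.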
 
Do they still make proper use of their potential, one year after releasing a card in that range?

Nope, gots maxwell to sell.

Then next year it's pascal.

More people need to start seeing graphics card purchases as hardware and driver support. Hardware alone doesn't get you what you'd expect after a little while, once they drop proper support. One year. And they're charging £500 for the cards. And hoping people SLI.

That's a good business model. It's fucking awful for consumers who get the short end of the stick and pay for performance increases they could already have had from their current hardware with decent support.

Agreed. Surely, if you give consumers good support and value over the lifetime of a card, that'd cultivate brand loyalty?

I see people dismissing concerns as "tin foil" conspiracies, yet Nvidia had the gall to launch what was effectively a 3.5GB card as a 4GB card, then tried to spin it as "working as intended". Then we had people in denial heavily downplaying the importance of VRAM, trying to dismiss the issue.
 
So, what I'm understanding here (and I'm not sure if I'm right) is that performance between driver versions on the same card, running the same game, is actually DROPPING on Green Team GPUs?
 

Nikodemos

Member
Past generation? Look how poorly the Titan X has aged in two months.
Not least due to announcements like the 980Ti making the Titan X look like a pretty bad buy for both those who bought it and those who didn't.

TW3 changes what ULTRA means depending on the card, AFAIK (full/half/no Hairworks, etc.).
Given the relatively lacklustre performance of 700 series cards, that feature might not be working properly.
 
The idea of using percentages is to be able to compare several games at once. Comparing framerates wouldn't be a good idea because of how different the demands can be from game to game.

It would be nice to know if Kepler's performance has decreased in an absolute sense since launch (in the same game). Maybe someone already addressed this in the thread and I missed it.
 

The Llama

Member
It would be nice to know if Kepler's performance has decreased in an absolute sense since launch (in the same game). Maybe someone already addressed this in the thread and I missed it.

I don't think anyone's tested it, but I'd really, really, really doubt that.
 

meanspartan

Member
Radeons go much longer in my experience.

5770 took me from 2009 to 2013 and still hit at least medium settings at 1080p on most new games.

7850 from 2013 til recently, when my friend gave me a crazy good deal on his used 280x, my current card. Probably coulda kept the 7850 goin for 2 more years.
 

mkenyon

Banned
Whenever a new architecture comes out, it opens up many new optimization opportunities in software. The fact that Maxwell, which is far newer and thus less optimized, now gets larger improvements in new driver versions is perfectly normal, and decidedly not some sign of a huge conspiracy.

Those poor people using the best monitor technology for gaming currently available. Yeah, we really should feel for them.
Durante, is the obvious reason for continued GCN performance improvements (poor AMD driver performance) on chips as old as the 7970/280X really the only factor at play? Or are these just overly powerful cards that are difficult for one reason or another to get the most out of?

This is kind of a complicated question, mostly because I don't 100% know what I'm asking. But..

Older GCN cards get continued performance improvements on newer games in late 2014 and 2015. They do not get similar gains on *older* games. That seems to speak to the fact that these aren't generalized performance tweaks for the architecture as a whole, but rather for specific engines/games, yes?

Which leads to my main question, though this could be invalidated by me losing perspective above. Why would overhead exist in new games for old GCN cards, but the same is not true for Kepler?
It would be nice to know if Kepler's performance has decreased in an absolute sense since launch (in the same game). Maybe someone already addressed this in the thread and I missed it.
No, it has not. Check out the links in the OP. It's a complex comparative analysis that pits the relative performance of various GCN cards vs. Kepler cards in 2013/14 games against late 2014/15 games.
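
Roughly speaking, I'd picture the comparison like this (the fps numbers below are invented just to show the shape of it, not taken from the OP's links): bucket games by release window, compute the GCN card's fps relative to the Kepler card per game, and see how the average ratio moves between buckets.

```python
# Invented fps numbers, purely illustrative of the comparison method.
games = [
    # (title, era, kepler_fps, gcn_fps)
    ("2013 game",      "2013/14",      70.0, 68.0),
    ("2014 game",      "2013/14",      55.0, 54.0),
    ("late 2014 game", "late 2014/15", 48.0, 51.0),
    ("2015 game",      "late 2014/15", 40.0, 44.0),
]

def era_ratios(games):
    """Average GCN/Kepler fps ratio per era; a rising ratio means GCN gained ground."""
    buckets = {}
    for _, era, kepler_fps, gcn_fps in games:
        buckets.setdefault(era, []).append(gcn_fps / kepler_fps)
    return {era: sum(ratios) / len(ratios) for era, ratios in buckets.items()}

print(era_ratios(games))
# e.g. {'2013/14': ~0.98, 'late 2014/15': ~1.08} -- the relative gap shifting toward GCN
```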
 
I thought this was generally a known thing if you've kept an eye on driver releases? I remember AMD doing the same thing during their 4xxx, 5xxx and 6xxx series.

The problem is that neither AMD nor Nvidia is "doing" anything. The conspiracy theories are pretty funny though. I'm guessing the real problem is that video card prices on the high end have been so inflated lately that people who dropped $1k+ on OG Titans and are suddenly being outperformed by new-gen $300 cards are more unhappy about it than in the days when the high end topped out at $500.

It's not a problem with a lack of old-generation driver optimizations, because that's always how it's been. A new generation comes out, old-generation optimization stops. But now there are cards in the old generation which cost four figures, and the pain of this driver EOL is magnified more than before.
 

The Llama

Member
Durante, is the obvious reason for continued GCN performance improvements (poor AMD driver performance) on chips as old as the 7970/280X really the only factor at play? Or are these just overly powerful cards that are difficult for one reason or another to get the most out of?

This is kind of a complicated question, mostly because I don't 100% know what I'm asking. But..

Older GCN cards get continued performance improvements on newer games in late 2014 and 2015. They do not get similar gains on *older* games. That seems to speak to the fact that these aren't generalized performance tweaks for the architecture as a whole, but rather for specific engines/games, yes?

Which leads to my main question, though this could be invalidated by me losing perspective above. Why would overhead exist in new games for old GCN cards, but the same is not true for Kepler?

I could see a situation where AMD got better at optimizing for GCN over time, which would explain why newer games show better relative performance. I don't actually think that's what's happening though.
 

Theonik

Member
I'm a bit confused here: are we saying that Radeon GPUs get more powerful over time? Or, even weirder, are we saying that Nvidia GPUs get weaker over time?
nVidia has admitted to dropping the ball on the 7xx series drivers after Maxwell. You could, I suppose, say they are doing this on purpose to sell 9xx GPUs, though you'd probably sound like an idiot. Changes in architecture will usually mean that actual performance gaps widen once you start getting into optimisations for newer cards.

Not that they shouldn't look to rectify this.
Edit: Mind, nVidia is in the business of fucking up their driver codebase and doing crazy game-specific driver optimisations, which have been lacking on Kepler, so that might be a lack of effort on their part. Thank fuck Vulkan and DX12 might actually end this madness.
 