
Nvidia past generation GPUs aging terribly

wachie

Member
Oct 6, 2014
5,848
224
610
I wasn't aware of this until mkenyon (credit to Crisium) brought it up in this thread here.

GTX 780Ti
Lead over 290X = +9.7% (Nov 7, 2013)
Lead now = +4.2%
Lead over 290X decreased over 50%.

GTX 780
Lead over 290X = -7% (Oct 23, 2013)
Lead now = -16.3%
290X lead increased over 100%.

GTX 760
Lead over 270X = +12% (Oct 7, 2013)
Lead now = +4.6%
Lead over 270X decreased over 60%.

So you can see across the board - big Kepler, performance Kepler and mainstream Kepler have all seen their performance age poorly. Really hoping this is an isolated case and does not repeat with Maxwell. You can also read the comparison by Crisium.
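For anyone who wants to check the math on those "decreased over 50%/60%" and "increased over 100%" claims, here's a quick sketch. The card pairs and lead percentages are the ones quoted above; the "lead" is the relative gap between the two cards' average scores, with the sign indicating which card is ahead:

```python
# Relative change in the performance gap between each Nvidia card and its
# AMD counterpart, from the launch-era review to now.
def gap_change(lead_then, lead_now):
    """Percent change in the magnitude of the lead. The sign of a lead says
    which card is ahead; its magnitude is the size of the gap."""
    return (abs(lead_now) - abs(lead_then)) / abs(lead_then) * 100

pairs = {
    "780 Ti vs 290X": (9.7, 4.2),    # Nvidia ahead, lead shrinking
    "780 vs 290X":    (-7.0, -16.3), # AMD ahead, lead growing
    "760 vs 270X":    (12.0, 4.6),   # Nvidia ahead, lead shrinking
}
for name, (then, now) in pairs.items():
    print(f"{name}: gap changed by {gap_change(then, now):+.1f}%")
```

That works out to roughly -57%, +133% and -62% respectively, which matches the "over 50%", "over 100%" and "over 60%" figures in the post.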

Who keeps their cards over a year anyway, right? RIGHT!!!1
 

bomblord1

Banned
Sep 6, 2014
9,616
2
0
I'm a bit confused here. Are we saying that Radeon GPUs get more powerful over time? Or, even weirder, are we saying that Nvidia GPUs get weaker over time?
 

Serick

Married Member
Jun 26, 2013
1,322
0
580
Yeahhhh, if the 980 ends up in the same boat I may switch to the red team (simply because I want to get off the upgrade every year cycle that I'm currently on.)
 

Septimus

Member
May 30, 2007
2,541
0
1,140
OC, California
I'm a bit confused here. Are we saying that Radeon GPUs get more powerful over time? Or, even weirder, are we saying that Nvidia GPUs get weaker over time?

I think what is implied is that they don't continue to get optimized in hopes that people keep buying newer cards.
 

zon

Member
Apr 15, 2006
6,004
0
0
I'm a bit confused here. Are we saying that Radeon GPUs get more powerful over time? Or, even weirder, are we saying that Nvidia GPUs get weaker over time?

More like AMD takes much longer to make good drivers than Nvidia.
 

kabel

Member
May 25, 2014
2,540
0
325
Berlin
I'm a bit confused here. Are we saying that Radeon GPUs get more powerful over time? Or, even weirder, are we saying that Nvidia GPUs get weaker over time?

I actually think Nvidia applied the Apple OS update strategy to their GPU drivers.

Maxwell has to look more powerful.
 

GhostTrick

Banned
Jan 11, 2012
16,580
12
0
Are we comparing the same games, at least? Also, which games?
Because if it's comparing a game where your GPU scores 180fps and another one where it scores 150fps...
 

LilJoka

Member
Dec 22, 2013
6,122
2
0
London, UK
It's been known to me since the 970 debacle, and it's why I was so vocal about the 3.5GB issue. Either Nvidia dropped optimisation, or the new games are taking advantage of architecture that is more prevalent on AMD cards and Maxwell compared to Kepler. If it's the former, I fear for the 970.
 

Serick

Married Member
Jun 26, 2013
1,322
0
580
Don't Nvidia cards suffer from RAM rot over time?

RAM rot? No idea what that is; explain/link?

But no, it's just driver support. AMD benefits from using the same architecture (GCN) for longer so their cards see better optimization over time.

This is either a contradiction or I've forgotten how to read.

Those are negatives. The 780 was 7% slower; now it's 16.3% slower.
 

10k

Banned
Mar 20, 2012
15,859
1
0
32
Toronto, Ontario, Canada
My 780 is struggling to hit 1440p60 on multiple games unless I turn down shadows and AF and sometimes even texture detail (3GB is not enough). That's why I bought the super clocked 980Ti. I'm hoping it doesn't become subpar in 2-3 years like the 780 did.
 

orochi91

Member
Jan 10, 2014
12,427
0
550
NVIDIA probably wants to make upgrading as appealing as possible.

Pretty scummy approach, and I hope AMD doesn't go down this path :/

I actually think Nvidia applied the Apple OS update strategy to their GPU drivers.

Maxwell has to look more powerful.

+1

Exactly what I was thinking.
 

Ivan Amiibo

Banned
Mar 14, 2015
4,651
0
0
Speaking of, is there an inexpensive, less power-hungry card to upgrade my 570 to? I don't even mind if it's around the same level of horsepower; just less power consumption would be great.

edit: should probably ask in the PC parts thread, sorry
 

Akai__

Member
Jun 14, 2012
9,691
0
0
Still on my 780Ti and I plan on keeping it until Pascal hits. Can't complain either, because all I do is 1080p @60FPS and it's awesome.
 

Mrbob

Member
Jun 7, 2004
63,747
6
0
Crazy results. I wonder if this will catch on and more investigative digging gets done.

Maybe I'll stay with the red team after all for my home theater PC.
 

wachie

Member
Oct 6, 2014
5,848
224
610
It'll be interesting to see how DX12 rearranges these charts if at all.
Are you expecting it to get worse or better? Because I thought it was well established that AMD's GPUs had worse CPU overhead.
 

Dictator93

Member
Jun 29, 2011
23,812
4
660
It is not surprising that Maxwell is much better than Kepler at a lot of things.

Still, it sucks to have a Kepler card that appears to be "under-performing".
 

DarthWoo

I'm glad Grandpa porked a Chinese Muslim
Jun 9, 2004
6,093
0
1,625
39
GTX 750 Ti or GTX 960 4GB.

I've got to second the 750 Ti. I bought a cheapy Dell system (Inspiron 3847 w/ i5-4460 for $250+tax) earlier this year that had a 300W PSU and no included GPU. I put the 750 Ti in, and it only requires the PCIe slot and no extra connectors from the PSU, and it runs wonderfully. I don't have any of the newest games to test on it, but it runs stuff like BF3, SC2, Star Trek Online and whatever just fine.

Edit: Mind you, I got the EVGA factory-OC'd version, but that doesn't really impact the power requirement at all. They do make an extra-large version that I believe does require one of the 6-pins, but that one was actually physically too large to fit in this Dell's case anyway. The standard 750 Ti is dual slot but less than 7 inches long, so it's a pretty tiny thing for what it does.
 

FunkyLounge

Member
Jun 23, 2013
1,176
0
0
Post-release performance increases via driver updates. Seems like AMD improves their performance more than Nvidia does.

As stated above, AMD takes more time to complete their drivers. Nvidia cards are not aging at all; this has nothing to do with Nvidia. What a misleading title!

AMD just isn't ready at launch.
 

wachie

Member
Oct 6, 2014
5,848
224
610
I've got to second the 750 Ti. I bought a cheapy Dell system (Inspiron 3847 w/ i5-4460 for $250+tax) earlier this year that had a 300W PSU and no included GPU. I put the 750 Ti in, and it only requires the PCIe slot and no extra connectors from the PSU, and it runs wonderfully. I don't have any of the newest games to test on it, but it runs stuff like BF3, SC2, Star Trek Online and whatever just fine.
Another plus point about that little 750 Ti is that it's actually a Maxwell GPU despite the 700-series sticker.
 

Dictator93

Member
Jun 29, 2011
23,812
4
660
Yes, the iPhone 6 is also much better than the iPhone 5. Yes, indeed.

I don't get your point.

I mean, the Maxwell architecture does things like tessellation and compute a lot better. So if a game uses those things, it will skew the performance to be more lopsided.

I don't mean it in terms of linear performance, like: the 760 is worse than the 780 Ti.
 

BlazinAm

Junior Member
Mar 11, 2012
4,015
7
530
www.twitch.tv
I guess you have to have strong long-term driver support if you're going to keep revising the same chipset and releasing it over and over.
 

Amey

Member
Jan 26, 2013
686
242
625
The benchmark suite probably isn't the same in both cases, since they span over 1.5 years.
If you throw in a few games that favor AMD or Nvidia hardware, then the average will surely change accordingly.

Basically, you can't rely on TPU's average graph. You need to collect only those benchmarks that are common to both tests, i.e. 2013 and 2015, and then take their average. All the other benches need to be excluded.
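The method described here is easy to sketch. The game names and FPS numbers below are made up purely to illustrate it; only games present in both test suites count toward the comparison:

```python
from statistics import mean

# Hypothetical per-game FPS results from a 2013 review and a 2015 review.
fps_2013 = {"Crysis 3": 45, "BF3": 78, "Tomb Raider": 60}
fps_2015 = {"Crysis 3": 47, "BF3": 80, "Witcher 3": 38, "GTA V": 52}

# Only benchmarks common to both suites are comparable; exclude the rest.
common = sorted(fps_2013.keys() & fps_2015.keys())
avg_2013 = mean(fps_2013[g] for g in common)
avg_2015 = mean(fps_2015[g] for g in common)
print(common, avg_2013, avg_2015)
```

Averaging over the intersection avoids the problem where new AMD- or Nvidia-leaning titles added to the 2015 suite shift the overall average on their own.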
 

wachie

Member
Oct 6, 2014
5,848
224
610
I mean, the Maxwell architecture does things like tessellation and compute a lot better. So if a game uses those things, it will skew the performance to be more lopsided.
Well, obviously Maxwell would be better than Kepler; it's a new architecture. The comparison point isn't between Maxwell and Kepler here, it's between Kepler and its AMD counterparts.
 

FunkyLounge

Member
Jun 23, 2013
1,176
0
0
AMD releases Project Cars drivers 1 month after launch of the game, which improves performance.

"Nvidia GPUs aged really bad during the last month compared to AMD"

That's basically what I see, am I missing something here?
 

CypherSignal

Member
Feb 20, 2011
736
0
0
Montreal
Why the heck is anyone looking at relative performance for this sort of thing?! Are the older GPUs actually becoming slower in absolute numbers as drivers are updated, or is it (far more likely) that Nvidia is focusing its optimization efforts on the newest cards, causing those to become faster in absolute numbers?
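The distinction is worth spelling out with a toy example (all numbers invented): an old card that loses zero absolute performance still "ages" in a relative chart whenever the newer card gains from driver work.

```python
# Absolute FPS for the same game, before and after a driver update.
old_card = {"before": 60.0, "after": 60.0}   # no absolute regression at all
new_card = {"before": 75.0, "after": 90.0}   # driver work targets the new arch

for when in ("before", "after"):
    rel = old_card[when] / new_card[when] * 100
    print(f"{when}: old card at {rel:.0f}% of the new card")
```

The old card's relative score drops from 80% to about 67% of the new card without getting a single frame slower, which is exactly why relative charts alone can't distinguish "Kepler regressed" from "Maxwell improved".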
 

PumpkinSpice

Banned
Aug 27, 2013
6,334
1
0
Why the heck is anyone looking at relative performance for this sort of thing?! Are the older GPUs actually becoming slower in absolute numbers as drivers are updated, or is it (far more likely) that Nvidia is focusing its optimization efforts on the newest cards, causing those to become faster in absolute numbers?

Seems like the newest games are getting way more driver-side optimization attention on the latest generation of cards, while older cards aren't. This isn't happening as quickly on the AMD side.
 

Rur0ni

Member
Aug 6, 2005
7,136
0
0
AMD releases Project Cars drivers 1 month after launch of the game, which improves performance.

"Nvidia GPUs aged really bad during the last month compared to AMD"

That's basically what I see, am I missing something here?
You're onto something.
 

Dictator93

Member
Jun 29, 2011
23,812
4
660
Well, obviously Maxwell would be better than Kepler; it's a new architecture. The comparison point isn't between Maxwell and Kepler here, it's between Kepler and its AMD counterparts.

I think everyone knew that GCN was better at a number of things than Kepler. That was also present at release.

As more games take advantage of core DX11 features... it suffers on Kepler and does better on GCN. Especially since games are geared toward the GCN consoles... hence they will use these methods more often as a base.

I just don't find any of this very surprising.