
Nvidia past generation GPUs aging terribly - version 2

NeOak

Member
The main reason is because AMD still relies on the same Hawaii chip and they need to optimize it, while Kepler was left on the roadside a long time ago in favor of Maxwell. The same will happen with Pascal, but you can expect the old GCN cards to be dropped off a cliff too once Polaris cards are out with their updated architecture. Perhaps not as drastically though, since the consoles still rely on it, but AMD driver support certainly will switch to Polaris only in no time.
It is still GCN. Do you know something about Polaris that we don't? No, right?

And please provide benchmarks before making this assumption. NVIDIA's performance loss has been proven.

I love how loyal NVIDIA fans come and try to predict AMD's future by using NVIDIA's actions as proof of something. Why? So people buy NVIDIA again and fall into planned obsolescence again?
 

orochi91

Member
Yup.

I figured as much; I'll never buy an NVIDIA card because of this type of fuckery.

They're such a blatantly shady company.
 
Could this be attributed to Nvidia having great driver optimisation out of the gate and AMD having more headroom for improvements?

That's the logical conclusion. Or we can go with the more far out conclusion that Nvidia has a team that knows how developers will make their games in the future and makes sure their cards can't run those games well or something.
 

Lister

Banned
It has mostly to do with the PS4 and XO (which both use AMD's GCN architecture) being the lead platforms for game devs now, with the games then being ported to PC.
So AMD cards (which are also GCN) benefit from that more than Nvidia.

I very much doubt that this is the case at all, since most games we're talking about here aren't using the new thinner APIs.

This is just AMD finally starting to catch up to Nvidia in DX11 performance due to driver optimizations.

Now, with DX12/Vulkan, AMD's GCN architecture might indeed turn out to have a slight advantage, at least with developers focusing on consoles.
 

NeOak

Member
That's the logical conclusion. Or we can go with the more far out conclusion that Nvidia has a team that knows how developers will make their games in the future and makes sure their cards can't run those games well or something.
Gameworks ran like shit on Kepler cards. I don't know if they finally fixed it.

More like, they just stop optimizing the drivers for the old cards.

And AMD catches up because they don't stop optimizing for the older GCN versions.

I really wish AMD had their flagship stuff out this year, it's rumored to be 2017 right?
Rumor is they moved it forward to October of this year
 

FireRises

Member
Gameworks ran like shit on Kepler cards. I don't know if they finally fixed it.

More like, they just stop optimizing the drivers for the old cards.

And AMD catches up because they don't stop optimizing for the older GCN versions.


Rumor is they moved it forward to October of this year

Oh, that'd be nice. I built my machine back in March, so I'm rocking a 970, which has fared well, but I definitely would like to see what AMD has before considering the 1080/70 stuff.
 
As it should be. I have a 780 myself, so I know that it's a powerful card, but when I see benchmarks showing a 960 outperforming the 780 it makes me think that something is not right. I'm on mobile right now so I can't post benchmarks, but for Doom it was a 4-7fps advantage for the 960, I think.

Probably depends on how high textures get set?
 

mephixto

Banned
From what I gather from this thread:

a) Console factor. Developers focus on consoles that use GCN GPUs, putting older cards like Nvidia's Kepler in a bad spot.

b) Day 1 optimization. Nvidia comes with well-optimized drivers right out of the gate, while AMD needs to catch up and takes its time to maximize its hardware.

c) New kid on the block. Nvidia prioritizes new cards and moves on to its newly released GPUs, leaving older cards with bare-minimum support.

d) Tin hat conspiracy. Nvidia is an evil corporation that only wants your money and blatantly gimps its older models so people buy its new ones.

e) All of the above.
 

NeOak

Member
b) Day 1 optimization. Nvidia comes with well-optimized drivers right out of the gate, while AMD needs to catch up and takes its time to maximize its hardware.

c) New kid on the block. Nvidia prioritizes new cards and moves on to its newly released GPUs, leaving older cards with bare-minimum support.
These have happened before with NVIDIA. So these.
 
And note that the only DX12 game that techpowerup is using is RoTR; these numbers would be much, much worse if they used any of the other DX12 or async compute titles like Ashes and Hitman.

I wonder how Maxwell will fare in the next year. The trend doesn't seem to be stopping.

They added Far Cry Primal and Hitman, which even in DX11 mode are increasing GCN averages.

Which is why comparing percentages from Techpowerup over the years is stupid: you are comparing numbers obtained from an ever-shifting pool of games.
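A toy illustration of that point, with completely made-up FPS numbers: no card gets any faster or slower in any single game, yet the summary percentage swings simply because the game pool changed.

```python
# Toy numbers (completely made up): two cards, four games. Per-game
# results never change; only the pool of games in the summary changes.
fps = {
    "older_game_a": {"nvidia": 60, "amd": 50},
    "older_game_b": {"nvidia": 80, "amd": 70},
    "newer_game_c": {"nvidia": 55, "amd": 65},  # console-style optimized
    "newer_game_d": {"nvidia": 45, "amd": 55},  # console-style optimized
}

def amd_relative_to_nvidia(games):
    # Average of per-game AMD/Nvidia FPS ratios over the chosen pool.
    ratios = [fps[g]["amd"] / fps[g]["nvidia"] for g in games]
    return sum(ratios) / len(ratios)

old_pool = ["older_game_a", "older_game_b"]
new_pool = ["older_game_b", "newer_game_c", "newer_game_d"]

print(f"old pool: {amd_relative_to_nvidia(old_pool):.0%}")  # ~85%
print(f"new pool: {amd_relative_to_nvidia(new_pool):.0%}")  # ~109%
# Same cards, same per-game FPS - the "trend" is the pool shifting.
```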
 

gatti-man

Member
LMAO

So AMD just plays catch-up while Nvidia is already moving forward, eh? Same shit every year.

Yeah, this could be one of a few things. Show me a game on old drivers vs. the same game with new drivers on the same hardware. If the fps stays the same on NVIDIA's side, that just means AMD is improving its drivers while NVIDIA is either getting them right the first time or simply not improving them.
 

mephixto

Banned
It also makes sense for AMD to have strong support for their older cards. If you look at the Steam Hardware Survey, AMD's top cards are the R9 200, HD 8800, HD 7900 and so on (the R9 300 series and Fury cards are missing from the list - low sales? a bug?).

On the other hand, you have 3 Maxwell cards at the top of the list (the 970 is the most used card), followed by mid-range/budget Kepler cards. Nvidia probably noted that most of their customers upgrade their cards to the new generation more frequently than AMD's do.
 

Henrar

Member
It's called planned obsolescence. Nvidia wants that upgrade money.
I don't see any proof of that from these tables. Unless someone compares 780 on old drivers and on new ones in the same games in the same test locations, nothing is proven.

EDIT. And by the way, someone on Reddit did that for the GTX 670. The guy compared all driver releases from 2012 until 2015 and found no proof of Nvidia gimping its old generation of GPUs.
 

wachie

Member
They added Far Cry Primal and Hitman, which even in DX11 mode are increasing GCN averages.

Which is why comparing percentages from Techpowerup over the years is stupid: you are comparing numbers obtained from an ever-shifting pool of games.
You mean newer games; we covered that ground already.
Feel free to peruse the Techpowerup reviews and their test rigs. Over time, the test rigs get better and/or newer games are added to their reviews.

It's easy to call it meaningless, but with newer hardware and in newer games the trend is consistent across different Nvidia architectures.
 

Irobot82

Member
The main reason is because AMD still relies on the same Hawaii chip and they need to optimize it, while Kepler was left on the roadside a long time ago in favor of Maxwell. The same will happen with Pascal, but you can expect the old GCN cards to be dropped off a cliff too once Polaris cards are out with their updated architecture. Perhaps not as drastically though, since the consoles still rely on it, but AMD driver support certainly will switch to Polaris only in no time.

Kepler, Maxwell and Pascal are all just optimizations of the same architecture. They aren't all completely new re-writes.

From Kepler to Maxwell they chunked each SM into four smaller blocks.

[Image: Maxwell vs. Kepler performance chart]

[Image: Kepler vs. Maxwell SM diagram]

and Pascal is a further iteration of this.

[Image: GeForce GTX 1080 SM diagram]
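To put rough numbers on that reorganization (a back-of-the-envelope sketch; core counts per SM are Nvidia's published figures, the 1.0 GHz clock is made up to keep the math simple):

```python
# Back-of-the-envelope: peak FP32 throughput is cores x 2 (an FMA counts
# as two ops) x clock. Core counts per SM are Nvidia's published figures;
# the 1.0 GHz clock is made up for simplicity.

def peak_gflops(cores_per_sm: int, num_sms: int, clock_ghz: float) -> float:
    return cores_per_sm * num_sms * 2 * clock_ghz

# Kepler: one big 192-core SMX with shared schedulers.
# Maxwell/Pascal: a 128-core SM split into 4 blocks of 32 cores, each
# block with its own scheduler and register slice - easier to keep fed.
gtx_780 = peak_gflops(cores_per_sm=192, num_sms=12, clock_ghz=1.0)  # Kepler
gtx_970 = peak_gflops(cores_per_sm=128, num_sms=13, clock_ghz=1.0)  # Maxwell

print(f"GTX 780 (Kepler):  {gtx_780:.0f} GFLOPS on paper")  # 4608
print(f"GTX 970 (Maxwell): {gtx_970:.0f} GFLOPS on paper")  # 3328
# The Maxwell card has ~28% fewer paper FLOPS yet generally matches or
# beats the 780 in games - per-SM utilization went up, not raw width.
```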
 

arevin01

Member
This is why I keep older drivers installed for my cards. I used to test new drivers to see if performance increased, but the opposite usually happened in the games I played, so I stopped updating drivers.

I don't update drivers until the game tells me it's mandatory.
 

Nachtmaer

Member
As most have already said, AMD has had the luxury (if you can call it that) of being stuck on the same architecture that only got minor revisions for nearly five years. They were forced to squeeze out as much performance as they could because they didn't have the resources to do a complete overhaul like nVidia did. Throw in huge layoffs and 20nm for high performance getting canned and you basically had a recipe for disaster.

In that timeframe nVidia went from Fermi to now Pascal. I guess you could argue not all of those were wildly different from one to the other. GP104 seems to be Maxwell on steroids.

There probably isn't some big conspiracy going on where they'd actively gimp their old cards. It'd be silly to divert manpower to keep extracting performance out of them, beyond fixing game-breaking bugs, when there's a fancy new series you can already buy. You can call it planned obsolescence or the consoles being a major factor, but the truth is probably somewhere in the middle.
 

gatti-man

Member
I don't see any proof of that from these tables. Unless someone compares 780 on old drivers and on new ones in the same games in the same test locations, nothing is proven.

EDIT. And by the way, someone on Reddit did that for the GTX 670. The guy compared all driver releases from 2012 until 2015 and found no proof of Nvidia gimping its old generation of GPUs.

So then what most likely is happening is that NVIDIA shifts driver priority to new hardware, while AMD shares priority across most of its hardware SKUs, even legacy ones. Which means if you're buying new, buy green; if buying used, buy red, I guess. I tend to buy new even though I know it's pretty indulgent.
 
So then what most likely is happening is that NVIDIA shifts driver priority to new hardware, while AMD shares priority across most of its hardware SKUs, even legacy ones. Which means if you're buying new, buy green; if buying used, buy red, I guess. I tend to buy new even though I know it's pretty indulgent.

Or if you aren't going to buy a new card each generation, AMD has another + to consider.
 

CJVaughn

Banned
How DARE you all tarnish the holy name that is NVIDIA with this blasphemous claim
Have had a 7970 for about 3 years and it has aged wonderfully. AMD is a great investment.
 
AMD have come a LONG way with their drivers.

Their GCN drivers were complete guttertrash when they first released the HD 7970 etc., and they've kept improving performance over time.
 

Philtastic

Member
This is easy enough to prove: someone with an Nvidia card, do a benchmark with both old and new drivers TODAY and compare them. Simple as that.
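For anyone who actually tries it, the number-crunching half is trivial. A minimal sketch, assuming you've captured a frame-time log from the same benchmark run on each driver (one millisecond value per line; adapt the parsing to whatever your capture tool writes - file names here are just examples):

```python
# Compare two runs of the same benchmark: one on the old driver, one on
# the new. Assumes frame-time logs with one millisecond value per line.
import statistics

def summarize(path: str):
    """Return (average FPS, 1% low FPS) from a frame-time log in ms."""
    with open(path) as f:
        frame_ms = sorted(float(line) for line in f if line.strip())
    avg_fps = 1000.0 / statistics.mean(frame_ms)
    slowest_1pct = frame_ms[int(len(frame_ms) * 0.99):]  # worst frames
    low_fps = 1000.0 / statistics.mean(slowest_1pct)
    return avg_fps, low_fps

for driver, log in [("old driver", "frametimes_old.csv"),
                    ("new driver", "frametimes_new.csv")]:
    avg, low = summarize(log)
    print(f"{driver}: {avg:.1f} avg FPS, {low:.1f} 1% low FPS")
```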
 

Unstable

Member
So then what most likely is happening is that NVIDIA shifts driver priority to new hardware, while AMD shares priority across most of its hardware SKUs, even legacy ones. Which means if you're buying new, buy green; if buying used, buy red, I guess. I tend to buy new even though I know it's pretty indulgent.
^ this.

Also, I'd like to point out that, unlike Nvidia, AMD has been on very similar architecture for the past few years. Polaris will change that.
 

Kaako

Felium Defensor
Puts things into perspective. AMD has better legacy support now lol. They've come a loong way it seems with driver support as well.
 

sega4ever

Member
^ this.

Also, I'd like to point out that, unlike Nvidia, AMD has been on very similar architecture for the past few years. Polaris will change that.

console devs will still optimize for ps4/neo/xbone/xbone1.5(?) so amd owners not on polaris/vega should be alright.
 

Philtastic

Member
You didn't read past the thread title.

One of the main concerns is whether Nvidia's cards have gotten worse COMPARED TO THEMSELVES, an issue that has been brought up in this thread. If this were merely a matter of drivers, we could simply install the different drivers from those time periods and verify whether or not newer drivers are gimping the hardware.

Of course, this wouldn't rule out the case that they just aren't applying optimizations for newer games for older hardware but it would show whether or not they are intentionally making older hardware slower.

EDIT: Actually, did you do any of these benchmarks yourself? If so, you have the actual frame rate data and can directly answer this accusation that people have brought up. Is there any actual degradation of frame rate over time in the same games between driver revisions? Or is it just that the relative performance of AMD's cards gets better while the Nvidia cards stay the same? Conversely, do the AMD cards perform better in the same games with newer driver revisions?
 
This is a combination of various factors. The biggest is developers writing their games around AMD hardware, now that on a graphics level both consoles are extremely similar to AMD GPUs in the PC space (this has never happened before); optimizations have quite a bit of carry-over, according to various developers on B3D. AMD also benefits on the driver side: their optimizations apply all the way back to the original GCN cards. Their architecture is also more robust (geometry aside) due to all the hardware scheduling logic and a better cache system (AMD's biggest weakness is register allocation). Nvidia moved to a shared cache system and driver-controlled scheduling with Kepler, which saves power but isn't so good for performance or for developers. Nvidia performance, however, has not regressed when comparing a GPU to itself.
 

x3sphere

Member
I don't think Nvidia is intentionally crippling old cards; I think Kepler was just not as forward-thinking an architecture (at least for games). Time will tell if the same happens to Maxwell; I'm thinking it won't, since Pascal seems to be a smaller change comparatively.

There is a tidbit from Kyle at [H] that I found interesting: in his original GTX Titan review, he says the GK110 silicon used in the 780, 780 Ti and Titan was never meant to be used on a retail video card. I wouldn't be surprised if NV knew full well it wouldn't hold up as well long term for gaming.

Kyle’s thoughts: I want to comment and share opinion here on several fronts when it comes to GeForce GTX Titan. While you will find all kinds of rumors and statements made about the GK110 silicon, it was never meant to end up on a retail video card. What we are seeing in Titan is a reaction from NVIDIA to what it thought AMD was going to launch and NVIDIA did not want to be seen as having no answer. AMD has gotten a lot better in the last couple of years of holding its cards close to its vest and simply put NVIDIA read its competition wrong and felt as though it was going to be in a position that it had to have a new product; and it did not. So we have a Titan launch and AMD has nothing hardware-wise.
 

Engell

Member
This is because:
1) ATI (now AMD) hardware was always awesome, but their drivers were always terrible. This is more under control now, and drivers have improved dramatically within the last year; that is why you see the great performance now. Try taking a driver from two years ago and putting it up against an Nvidia driver from two years ago... the Nvidia card will most likely slaughter the AMD card.

2) All consoles use AMD hardware, therefore all games are optimized for AMD hardware... AMD is already winning the GPU race; it was a huge mistake for Nvidia not to get in on this (yes, I understand they don't have an x86 arch).
 

TSM

Member
What's funny is that another way to phrase the OP is "Why is it that AMD hardware performance starts off so terribly and you have to wait months or years for it to reach its potential?"
 

dr_rus

Member
The main reason you're seeing this is because console engines are being more and more optimized for GCN (console) h/w, which in turn means that they are less and less optimized for NV h/w (as these optimizations tend to go in opposite directions these days) - only a handful of developers spend any measurable amount of resources on PC renderer optimizations; most just dump the code from the console versions, add some LOD tweaking options, and you're good to go.

If you reflect back a bit - Kepler was launched 1.5 years before the new consoles, and most of the issues with it started around this time last year - some 1.5-2 years after the new console launches, when devs had pushed the most out of the new console h/w. The same issues are manifesting themselves on Maxwell as well, but to a lesser degree, as Maxwell fixed a lot of Kepler's slow paths.

Polaris is unlikely to be a big change to the GCN architecture and will probably bring improvements in the vein of the usual previous GCNx updates. However, I think that things will be a bit different going forward.

A. Because Pascal is basically Maxwell, which means that a single NV optimization target will remain for the next year, and this should help with PC optimizations where they are actually being implemented (supporting FL11_0 Kepler at this point is a pain in the ass for most multiplatform developers, so it is essentially being dropped, slowly but inevitably).

B. Because console optimizations are likely close to their peak at the moment, and thus it's possible that GCN h/w won't receive as much uplift from consoles as it did during the previous two years. The big unknown here is the influence PS4K/Neo will have on the scene, though. It's possible that thanks to it we'll see a continuation of that trend on the higher end of the Radeon lineup (the lower end is likely pushed to its limits by PS4/XBO already).

Kepler, Maxwell and Pascal are all just optimizations of the same architecture. They aren't all completely new re-writes.
That's not accurate. Maxwell and Pascal are very close, that's true, but Kepler is a different architecture which is closer to Fermi (and even Tesla in some regards) than to Maxwell. Obviously no new architecture is a "complete re-write", as that would be very inefficient from an R&D point of view. But the amount of change that went into Maxwell is actually very high. The next NV arch which may have the same magnitude of refresh will be Volta, and then, yeah, something like Kepler-Maxwell can happen again, with Pascal GPUs losing a lot of ground to the newer Volta GPUs.

I'm optimistic on this though as Pascal should be pretty good at handling most of GCN console code and because of this I think it's unlikely that there will be a lot of areas where radical improvements of Volta will bring appropriately radical performance gains when compared to Pascal. Volta's arch may actually be a bit too complex for the typical code it will run in 2017/18 as it's likely to push beyond DX12 FL12_1. But guessing this is always inaccurate.

One of the main concerns is whether Nvidia's cards have gotten worse COMPARED TO THEMSELVES, an issue that has been brought up in this thread. If this were merely a matter of drivers, we could simply install the different drivers from those time periods and verify whether or not newer drivers are gimping the hardware.

They didn't. This isn't an issue of drivers or gimping, it's an issue of the general change in the average rendering code practices which is happening because of new console h/w.
 

Philtastic

Member
Now that I have more time to formulate a post, here's the experiment that I think would need to be done to truly quantify this, which I alluded to earlier.

Take an older-gen and a newer-gen card from both Nvidia and AMD, e.g. 780 Ti & 980 Ti and 290X & 390X (I tried to pick cards that launched around the same time, but that's difficult to do). Pick a suite of recent games. To establish baseline performance before game-specific optimizations are added in, we can install the AMD drivers from when the 390X launched in June 2015 for both the 290X & 390X, and install the Nvidia drivers from when the 980 Ti launched, also in June 2015, for both the 780 Ti & 980 Ti. Benchmark games that came out after these June drivers to see how they run without game-specific optimizations. Now, install the latest drivers for all cards and run those same benchmarks again.

With this data, we can then measure the improvement in performance:
1) for the same card using different drivers which reflects game-specific optimizations
2) between cards of the same manufacturer to see whether each card gets the same boost in performance.

If Nvidia or AMD are not applying game-specific optimizations to the older cards, we wouldn't see much if any boost for the older cards but we would see a significant improvement for the newer cards. If they are fully applying optimizations and the game caters well to all architectures, then we would see similar/equal boosts between the older and newer cards. If the older cards get some significant boost but less than the newer cards, we might be more likely to say that there isn't a conspiracy to gimp older cards since they're still seeing significant improvements, just not as large as newer architectures are.

This is a lot of work so I'm not actually expecting anyone to do this but it makes for an interesting thought experiment.

Edit: I forgot a 4th scenario - no improvements for both older and newer cards. This would also tell us that the manufacturer is not doing anything shady since they aren't implementing performance improvements for either card.
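If anyone ever does run it, the analysis side is simple bookkeeping. A sketch with hypothetical FPS numbers, mapping the output onto the scenarios above:

```python
# All FPS numbers hypothetical - this just shows the bookkeeping.
# One benchmark suite, four cards, two driver snapshots per vendor.
results = {
    "780 Ti": {"june_2015": 52.0, "latest": 53.0},
    "980 Ti": {"june_2015": 70.0, "latest": 78.0},
    "290X":   {"june_2015": 50.0, "latest": 56.0},
    "390X":   {"june_2015": 55.0, "latest": 62.0},
}

for card, fps in results.items():
    gain = fps["latest"] / fps["june_2015"] - 1.0
    print(f"{card}: {gain:+.1%} from driver updates alone")

# Reading the output against the scenarios above:
#   old card ~0%, new card big gain -> old card no longer gets
#                                      game-specific optimizations
#   similar gains across the line   -> optimizations applied everywhere
#   old card gains, but less        -> deprioritized, not "gimped"
#   no gains anywhere               -> nothing shady for either card
# Only an outright regression (negative gain) on the old card would be
# direct evidence of active gimping.
```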
 
This is because:
1) ATI (now AMD) hardware was always awesome, but their drivers were always terrible. This is more under control now, and drivers have improved dramatically within the last year; that is why you see the great performance now. Try taking a driver from two years ago and putting it up against an Nvidia driver from two years ago... the Nvidia card will most likely slaughter the AMD card.

2) All consoles use AMD hardware, therefore all games are optimized for AMD hardware... AMD is already winning the GPU race; it was a huge mistake for Nvidia not to get in on this (yes, I understand they don't have an x86 arch).

I agree. These are two big factors for AMD that lead me to believe they will bounce back this gen.
 
What's funny is that another way to phrase the OP is "Why is it that AMD hardware performance starts off so terribly and you have to wait months or years for it to reach its potential?"
No kidding. Hell, you could still cheerlead for AMD -- "AMD's drivers improve more over time than Nvidia's" -- without FUDing Nvidia.

The actual information that would support the thread's assertion is Nvidia hardware running worse with newer drivers, which doesn't happen. There's no "aging" going on here by any reasonable definition.
 