
Nvidia's GameWorks nerfing AMD cards and its own previous-gen GPUs?

DeepEnigma

Gold Member
This happens with Apple and iOS too.

There is no reason the core, fundamental functions of these devices should suddenly chug and stutter with new software versions. Running newer, more demanding apps, sure, but core items like the browser should not go from smooth to a stuttering slideshow the way they do. It gets to the point where the experience is no longer fun and starts to feel like the device is broken or dying. The only solutions are a newer phone/tablet, or a warranty-voiding jailbreak, since they stop signing older firmware quickly now to prevent rolling back.
 

Octavia

Unconfirmed Member
I turn all that stuff off, it just tanks fps even on a 970.

Whether all the theory stuff is true or not, I don't really know. What I do know is that I can't think of a single instance in my entire Nvidia gaming career where they announced a new driver that increases performance, for select titles or across the board, and I then ran a pre-update benchmark, did a clean driver install and restart, ran the post-update benchmark, and actually got that stated increase. In fact, my frame rates almost always decrease by about 2-10%. It's gotten to the point where I'm afraid to update drivers anymore (which is funny considering how much AMD gets ripped on for exactly that).

So there's that for my anecdote.
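For what it's worth, here's roughly how I tally the before/after numbers; a minimal sketch assuming two hypothetical CSV exports of per-title average FPS:

```python
# Minimal sketch: compare average FPS per title before and after a driver update.
# Assumes two hypothetical CSV files with columns: title, avg_fps
import csv

def load_fps(path):
    """Read a benchmark CSV into {title: average fps}."""
    with open(path, newline="") as f:
        return {row["title"]: float(row["avg_fps"]) for row in csv.DictReader(f)}

before = load_fps("bench_before_driver.csv")  # hypothetical file name
after = load_fps("bench_after_driver.csv")    # hypothetical file name

for title in sorted(before.keys() & after.keys()):
    delta = (after[title] - before[title]) / before[title] * 100.0
    print(f"{title:<25} {before[title]:6.1f} -> {after[title]:6.1f} fps ({delta:+.1f}%)")
```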
 

diablos991

Can’t stump the diablos
I buy and play PC games on release. No way in hell would I get an AMD.

I just sell and upgrade my NVIDIA card every year. They hold their value decently.

Can't deal with the AMD jank in the release window of big new titles.
 

x3sphere

Member
Eh, the same thing happened with AMD cards in prior generations. I remember that when an all-new series came out, the older one would basically just stop receiving optimizations, or get minimal support.

Outside of the Fiji/Fury cards, AMD hasn't changed its architecture in quite a while though. When the whole line gets refreshed with Polaris GPUs, I'd expect the same to happen with the 290X/390X...
 
While some of the examples, like the infamous Crysis 2 tessellation myth, or using screencaps to show how HairWorks curbs performance when the whole point of the feature is to push hair physics forward rather than raw detail, do show how misinformed the video uploader is, I'm not going to deny nVidia has practiced some shady stuff in the recent past, more so after its exit from the console SoC space, which AMD now dominates completely and will next gen as well. The aggressive surge of GameWorks-equipped high-profile releases in the last three years speaks for itself, let alone their performance.

The reason black-box APIs exist is to push proprietary software/hardware forward, period. While middleware initiatives like GPUOpen are commendable for being open source, their proliferation really comes down to how much money the hardware vendor is willing to shell out for a game to use them. It's highly disingenuous to blame nVidia entirely here and not game publishers/developers equally. Let's be honest, pubs/devs wouldn't bat an eye at it until you shower them with money.

That said, do we even have the roughest estimate of how much (in millions) these hotly anticipated new titles get offered by nVidia to use GameWorks?
 
Eh, the same thing happened with AMD cards in prior generations. I remember that when an all-new series came out, the older one would basically just stop receiving optimizations, or get minimal support.

Outside of the Fiji/Fury cards, AMD hasn't changed its architecture in quite a while though. When the whole line gets refreshed with Polaris GPUs, I'd expect the same to happen with the 290X/390X...

What same thing? This video is talking about a deliberate attempt to reduce the performance of the previous generation.

It's one thing to release drivers/patches that increase the frame rate of newer cards; it's another to release a driver that decreases the performance of older cards. I do believe that in the case of Fallout 4 there was an error or misconfiguration between the game and the GPU.

I have a GTX 770, and GeForce drivers never seem to do any significant performance optimization. All I get is more processes from GeForce Experience; ShadowPlay is held hostage inside that useless software. At one point either Blizzard or Nvidia fucked up HotS performance on GTX cards. All GameWorks effects are off by default; resource-hungry gimmicks, that is all they are.

I still want to get at least two more years out of it, even if it means dropping details to medium or even low ;(
 

EdLin

Neo Member
So according to the video Nvidia nerfs their older cards.
AMD otoh stopped supporting their older cards.

AMD announced last year that, as of 24 November 2015, the 5000, 6000, 7000 and 8000 series would be moved to the legacy support category, which means they will not get any new driver support anymore. That's pretty bad, especially for the (6000 and) 7000 series, since they still run games decently.

Nvidia otoh still has full driver support for cards dating all the way back to the 400 series. Even very old cards like the 8000 series (e.g. the 8800 GTX) got a new driver in November 2015.

Correction: only some of the 7000 series are no longer supported, the ones that weren't GCN. All GCN cards are still supported, like all the HD 79xx cards.
 
Wow, this was an eye opener. Some of you are really naive for thinking it's not deliberate; all these companies care about is making money, not our best interests. Even Apple is doing this shit; newer iOS versions usually make older devices run like crap.

I used AMD Crimson to override The Witcher 3's tessellation to 8x and the game is smoother. The visual difference is minor.
 
I wonder why everyone blames Nvidia exclusively when they provide the tech libraries and support, while the developers of the game actually design and code the game and its options...
 
While some of the examples, like the infamous Crysis 2 tessellation myth, or using screencaps to show how HairWorks curbs performance when the whole point of the feature is to push hair physics forward rather than raw detail, do show how misinformed the video uploader is, I'm not going to deny nVidia has practiced some shady stuff in the recent past, more so after its exit from the console SoC space, which AMD now dominates completely and will next gen as well. The aggressive surge of GameWorks-equipped high-profile releases in the last three years speaks for itself, let alone their performance.

The reason black-box APIs exist is to push proprietary software/hardware forward, period. While middleware initiatives like GPUOpen are commendable for being open source, their proliferation really comes down to how much money the hardware vendor is willing to shell out for a game to use them. It's highly disingenuous to blame nVidia entirely here and not game publishers/developers equally. Let's be honest, pubs/devs wouldn't bat an eye at it until you shower them with money.

That said, do we even have the roughest estimate of how much (in millions) these hotly anticipated new titles get offered by nVidia to use GameWorks?

The black box thing is also incorrect. If you license GameWorks from Nvidia, you can access the source code with a special contract. It works like Unreal Engine used to before it became free.

I expect GPUOpen to be ignored like everything else AMD has ever done. There's no incentive to use it because AMD just puts a bunch of spaghetti code out there, says "here you go," and offers no support, improvement, or game integration assistance ever. Meanwhile, GameWorks is real middleware: you license it, you get Nvidia engineers working with you to help integrate it into your game, and if you want, you can get the source code. GameWorks is already supported as a component of Unreal Engine 4, so if you're developing on UE4, you just drop whatever GW library you want into your game and off you go.

I don't even really think Nvidia needs to pay anybody to use GameWorks. Nvidia just says: here's some middleware that lets you implement GPU physics, lighting, hair, and a bunch of other cool things in your PC port; sign here on the dotted line and we'll send some engineers over to help you put it in your game. The dev isn't going to say no to that, and PC games get features over their console brethren, which improves sales. It's a win-win for everyone except AMD.
 

wonzo

Banned
[image: 4B5MkRw.png]
 
This happens with Apple and iOS too.

There is no reason the core, fundamental functions of these devices should suddenly chug and stutter with new software versions. Running newer, more demanding apps, sure, but core items like the browser should not go from smooth to a stuttering slideshow the way they do. It gets to the point where the experience is no longer fun and starts to feel like the device is broken or dying. The only solutions are a newer phone/tablet, or a warranty-voiding jailbreak, since they stop signing older firmware quickly now to prevent rolling back.
Wow, this was an eye opener. Some of you are really naive for thinking it's not deliberate; all these companies care about is making money, not our best interests. Even Apple is doing this shit; newer iOS versions usually make older devices run like crap.

I used AMD Crimson to override The Witcher 3's tessellation to 8x and the game is smoother. The visual difference is minor.
Both my old Nexus 5 and Droid Maxx have been broken by Marshmallow; meanwhile my iPhone 6+ and iPad Mini Retina are still zipping along on iOS 9. Don't know what you guys are talking about.
 

Mozendo

Member
Both my old Nexus 5 and Droid Maxx have been broken by Marshmallow; meanwhile my iPhone 6+ and iPad Mini Retina are still zipping along on iOS 9. Don't know what you guys are talking about.

Wish I could say the same about my iPod Touch 4G.
Bought it the day it came out and loved it until iOS 6 was released and made it the laggiest piece of poo device I've ever owned and even decreased the battery life.
 
the original Fallout 4 Nvidia GeForce guide which got taken down almost immediately suuuuuure as hell made me think that way.

i've never seen a tweak guide fellate a graphical feature so hard. implying that there was a perceptible difference between godrays settings and that the performance tradeoff was worth it... the author does some damn good work but you and nvidia fucked up bigtime there, man, that shit was transparent to the utmost degree.
 
I always thought the exclusive GeForce options in video games with GameWorks were "funny", because they are always performance hogs. It's clear Nvidia is trying to push people to buy the $600 models of their GPUs with these options, instead of making something that looks nice and is efficient.
 
I always thought the exclusive GeForce options in video games with GameWorks were "funny", because they are always performance hogs. It's clear Nvidia is trying to push people to buy the $600 models of their GPUs with these options, instead of making something that looks nice and is efficient.

Only some GameWorks things are performance intensive. HairWorks was in The Witcher 3, the fog was in Batman: Arkham Knight, but there are numerous effects under the GW umbrella that are not unjustified performance hogs, and some of them are quite good. HBAO+ is pretty much the poster child for an inexpensive, good-looking effect from GW. TXAA is a high-quality AA technique with a performance cost similar to MSAA; not cheap, but respectable as long as the implementation is good. PhysX has had good and bad implementations since its inception, and I'm a fan of it in some games.

A large percentage of high-profile game releases on PC use GameWorks tech every year, and the developers are ultimately responsible for the inclusion of the effects in the game. Nvidia moneyhatting them to deliberately include garbage implementations in their games would not be the kind of thing you could keep a lid on. Bethesda, Rockstar, numerous Ubisoft studios, Eidos, Warner Bros., these aren't indies. At some point these studios were, and remain, convinced that the inclusion is worth it.

Another way to phrase "Nvidia is deliberately building effects that don't run well on low-end cards" would be "Nvidia is building effects to take advantage of their highest-end cards", which suddenly sounds a lot less sinister. It's only a problem if you have the kind of mental compulsion to upgrade as soon as you have to disable one effect in the menu to get good performance. And I know we have people like that, who have borderline mental breakdowns when they discover the card they bought 8 months ago doesn't max every single setting at their preferred framerate and resolution.
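For what it's worth, the per-effect cost question is easy to sanity check yourself: capture frame times with the effect off and on and compare. A minimal sketch, with purely made-up numbers and no particular capture tool assumed:

```python
# Rough sketch: measure what a single effect (e.g. HBAO+) costs by comparing
# frame times (in ms) captured with the effect off vs on.
# All numbers below are invented purely for illustration.
from statistics import mean

frametimes_off = [13.9, 14.1, 14.0, 14.3, 13.8]  # hypothetical capture, effect disabled
frametimes_on  = [15.2, 15.5, 15.1, 15.6, 15.3]  # hypothetical capture, effect enabled

avg_off, avg_on = mean(frametimes_off), mean(frametimes_on)
print(f"avg frame time: {avg_off:.1f} ms off vs {avg_on:.1f} ms on")
print(f"effect cost:    {avg_on - avg_off:.1f} ms per frame")
print(f"fps:            {1000 / avg_off:.0f} -> {1000 / avg_on:.0f}")
```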
 

patapuf

Member
I always thought the exclusive GeForce options in video games with GameWorks were "funny", because they are always performance hogs. It's clear Nvidia is trying to push people to buy the $600 models of their GPUs with these options, instead of making something that looks nice and is efficient.

Where but in the high-end graphics cards would you test new/additional graphical features?

It happens in literally every technology-driven industry. You don't introduce new/additional technology on the low-end models first. Stuff like physics, hair, tessellation, alternative AA techniques etc. is never going to be cheap in its first iteration, and in some cases ever. The only reason some are commonplace now is that the baseline is higher and these formerly "expensive" features are not seen as expensive anymore.

Most "efficient" graphical techniques are used by the original game devs anyway, especially when it's a console port.

Another way to phrase "Nvidia is deliberately building effects that don't run well on low-end cards" would be "Nvidia is building effects to take advantage of their highest-end cards", which suddenly sounds a lot less sinister. It's only a problem if you have the kind of mental compulsion to upgrade as soon as you have to disable one effect in the menu to get good performance. And I know we have people like that, who have borderline mental breakdowns when they discover the card they bought 8 months ago doesn't max every single setting at their preferred framerate and resolution.

yep.
 

Durante

Member
So there's that for my anecdote.
That's an anecdote.

Scientific measurements, as already posted in this thread, show that it's simply not the reality:
[benchmark chart: UUoY1nv.jpg]


Computerbase also do driver version reviews from time to time, and the overall performance trend is always flat or up, both on new and old GPUs. Sure, you might get individual decreases in one particular game with one particular version, but that's just the nature of software development on a complex project -- and GPU drivers are some of the most complex software out there.
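If anyone wants to sanity check that trend themselves, this kind of driver retest boils down to averaging the same game suite for each driver version and comparing against a baseline. A minimal sketch, with invented figures purely for illustration:

```python
# Sketch of what a driver-retest comparison boils down to: average the same
# game suite for each driver version and compare to a baseline.
# Driver labels and FPS figures below are invented examples, not real data.
from statistics import mean

results = {
    "old driver": {"GTA V": 58, "Witcher 3": 47, "Crysis 3": 52},
    "mid driver": {"GTA V": 59, "Witcher 3": 48, "Crysis 3": 51},
    "new driver": {"GTA V": 60, "Witcher 3": 48, "Crysis 3": 53},
}

baseline = mean(results["old driver"].values())
for version, fps in results.items():
    rel = mean(fps.values()) / baseline * 100
    print(f"{version}: {rel:.1f}% of baseline performance")
```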
 

nubbe

Member
People have started to observe this phenomenon with Nvidia,
and it wouldn't surprise me, since Nvidia is a scumbag company.
 

WolvenOne

Member
Yeah, no real surprise here. To be clear, nVidia cards are definitely good; their engineers are clearly very talented. However, the company itself is unscrupulous as heck.
 

DeepEnigma

Gold Member
Both my old Nexus 5 and Droid Maxx have been broken by Marshmallow; meanwhile my iPhone 6+ and iPad Mini Retina are still zipping along on iOS 9. Don't know what you guys are talking about.

My iPad 3 and 4S stutter, chug, and are no longer smooth on iOS 8 and above. Go to Apple and other forums; this is well known with their software on older phones/tablets.

You are comparing your experience with the newest iPhone to what this topic is about?

And nice Android mention, but the Apple comparison with the newest hardware is 'flawless'. /s

This is an issue for any software-as-a-service model tied to hardware as well. They want you to constantly upgrade. It is the mantra of a disposable society.

If not, then why do they make it so you can no longer roll back to the better-running OS without jumping through hoops and risking breaking the device through jailbreaking? The writing is plain to see.
 

Bolivar687

Banned
That's always been the thing about Nvidia's features: most of them don't run particularly well even on their own high-end and enthusiast-tier cards, but so long as AMD's run at least one frame lower and the benchmarks show a consistent performance advantage for Nvidia, it's worth it for them.

When I built my PC just over a year ago, the R9 290 was the best-value card on the market, and playing games built with Mantle, Gaming Evolved, or general AMD support was an eye opener compared to how Nvidia-sponsored games always ran on my Nvidia cards. It just seems like the way sponsored middleware should work: games looking and performing great and getting every bit of juice out of your cards. I bought a second 290 for CrossFire and have been playing Battlefield 4, and it's just stunning to play such a terrific-looking game at a near-locked 144 fps.

This shouldn't take away from Nvidia's accomplishments; the experience was always hassle-free for me, and I think they deserve a lot of credit for shifting the GPU conversation more towards efficiency and frame times rather than just raw power. But at the same time, AMD has done a lot by introducing GDDR5, now HBM, low-level APIs and async compute. I'm going to weigh all factors in the incoming Polaris/Pascal war, but all of Nvidia's shadiness and generally higher prices are going to factor into it. And at this point, I'm just so acclimated to tweaking games with Crimson and Radeon Pro that I'd just feel a lot more comfortable staying with them rather than supporting everything Nvidia has been trying to do.
 

RedSwirl

Junior Member
Only some GameWorks things are performance intensive. HairWorks was in The Witcher 3, the fog was in Batman: Arkham Knight, but there are numerous effects under the GW umbrella that are not unjustified performance hogs, and some of them are quite good. HBAO+ is pretty much the poster child for an inexpensive, good-looking effect from GW. TXAA is a high-quality AA technique with a performance cost similar to MSAA; not cheap, but respectable as long as the implementation is good. PhysX has had good and bad implementations since its inception, and I'm a fan of it in some games.

A large percentage of high-profile game releases on PC use GameWorks tech every year, and the developers are ultimately responsible for the inclusion of the effects in the game. Nvidia moneyhatting them to deliberately include garbage implementations in their games would not be the kind of thing you could keep a lid on. Bethesda, Rockstar, numerous Ubisoft studios, Eidos, Warner Bros., these aren't indies. At some point these studios were, and remain, convinced that the inclusion is worth it.

Another way to phrase "Nvidia is deliberately building effects that don't run well on low-end cards" would be "Nvidia is building effects to take advantage of their highest-end cards", which suddenly sounds a lot less sinister. It's only a problem if you have the kind of mental compulsion to upgrade as soon as you have to disable one effect in the menu to get good performance. And I know we have people like that, who have borderline mental breakdowns when they discover the card they bought 8 months ago doesn't max every single setting at their preferred framerate and resolution.

This has always been my feeling (though I never had a good feeling for how expensive HBAO+ and TXAA are).

I'm guessing all this concern is coming from people who try to turn up features like HairWorks or fancy fog or tessellation or whatever. My view is, if you're not running the game on a $600+ Nvidia card, those features aren't for you. Those features aren't for every PC gamer; just let HairWorks go. I don't even think they have that huge an effect on games, but then again I'm a person who can live with almost no anti-aliasing in most games. Personally I'm just satisfied that I'm getting roughly the same or better performance than a console, though I would like to run most games at 1080p/60fps on a Pascal card later this year. Maybe not with all the features cranked up, but at least with respectable visuals.
 
jesus, and i just bought a 970 ...

You and me both. This is pretty shitty of Nvidia but as long as the 970 is still getting 1080p 60fps at High settings on popular current games like GTA V, I can't see too many people caring.

If I simply choose not to update my drivers for my Nvidia GPU, will I be able to avoid these driver updates that reduce my performance in the games I play?
 

Jin

Member
I don't understand the hatred for GameWorks. In the past couple of games I've played, you can turn these effects off or down to low: PCSS, HBAO+, tessellation, Godrays, HairWorks/Fur, etc. When TR2013 came out, TressFX ran like shit on my 690; I turned it off and still enjoyed the game. When Ubisoft broke PCSS in Far Cry 4, I turned it off and kept playing. I really don't get the outrage over optional settings. I sometimes turn off HairWorks in The Witcher 3 just to push my FPS higher.

Speaking of GameWorks, where is AMD's own GameWorks? What exactly is Nvidia doing to get devs on board that AMD can't?
 

WolvenOne

Member
I don't understand the hatred for GameWorks. In the past couple of games I've played, you can turn these effects off or down to low: PCSS, HBAO+, tessellation, Godrays, HairWorks/Fur, etc. When TR2013 came out, TressFX ran like shit on my 690; I turned it off and still enjoyed the game. When Ubisoft broke PCSS in Far Cry 4, I turned it off and kept playing. I really don't get the outrage over optional settings. I sometimes turn off HairWorks in The Witcher 3 just to push my FPS higher.

Speaking of GameWorks, where is AMD's own GameWorks? What exactly is Nvidia doing to get devs on board that AMD can't?

I think part of the problem is that when you turn some of these features up to high, it ratchets the effect up to an absurd level that has no real discernible impact on visuals. However, if you turn them down to low, the effects are all but turned off. The fact that on an AMD card I have to limit tessellation to 16x or 8x if I want to enjoy any discernible tessellation at all without taking a huge performance hit is an issue.

Basically, there's no real middle-ground setting for some of these effects in GameWorks titles, and the only real reason for that is that they know it'll hurt the competition and older cards.

To be fair, yes, AMD could be doing more to combat this, but given their market share and financial troubles, that'll be difficult. Not impossible, just difficult. The best thing AMD could do is simply focus on making better cards that take less of a performance hit in cases of extreme tessellation. I mean, it's kinda silly that they have to, seeing as essentially nobody needs 64x tessellation, but as long as that's what nVidia is pushing, it's a condition they need to be prepared to handle.
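For a rough sense of scale (back-of-the-envelope only; real games use adaptive factors and culling), the triangle count of a uniformly tessellated patch grows roughly with the square of the tessellation factor, so 64x generates on the order of sixteen times the geometry of 16x:

```python
# Back-of-the-envelope: with uniform tessellation, the triangle count per patch
# grows roughly with the square of the tessellation factor. Real engines use
# adaptive factors and culling, so this is purely illustrative.
for factor in (8, 16, 32, 64):
    approx_triangles = factor ** 2  # per input patch, roughly
    print(f"{factor:>2}x tessellation -> ~{approx_triangles:>4} triangles per patch")
```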
 
I've used Nvidia from about 2000 right up until 2013, when I got my 7870 XT. To be honest, I don't get half the shit people say. Other than the driver CPU overhead, I've been very happy with it. Even that wasn't an issue until my i5-2400 started showing its age. My biggest issue with Nvidia is their pricing. The 970 is still pretty much the same price as a year ago, as there is no incentive to lower it. I hope AMD's next cards knock it out of the park.
 
Sorry for the bump, but I feel compelled to voice my outrage towards Nvidia as a Kepler owner. With new cards on the horizon, more people need to be aware of these shenanigans.
 
Sorry for the bump, but I feel compelled to voice my outrage towards Nvidia as a Kepler owner. With new cards on the horizon, more people need to be aware of these shenanigans.

Not likely to happen, since Pascal is virtually identical to Maxwell. It's not much more than a die shrink.
 

DonMigs85

Member
Not likely to happen, since Pascal is virtually identical to Maxwell. It's not much more than a die shrink.
There are quite a few enhancements, actually. Check out AnandTech's GTX 1080 preview. But yeah, Fermi to Kepler and Kepler to Maxwell were bigger changes.
Another thing is they've sold so many GTX 970 and 950 cards that there's probably gonna be even more outrage than before if they do obviously nerf them after Pascal is out.
 
There are quite a few enhancements, actually. Check out AnandTech's GTX 1080 preview. But yeah, Fermi to Kepler and Kepler to Maxwell were bigger changes.

The only documented hardware changes are SMP (which is meaningless for single-monitor gaming), the upgrade to conservative rasterization tier 2, and dynamic load balancing (I have doubts about just how much of that particular feature is even hardware-based). The memory compression isn't really applicable here.
 

K.Jack

Knowledge is power, guard it well
Still waiting for real, scientific proof of this.

Posts like this go ignored:

That's an anecdote.

Scientific measurements, as already posted in this thread, show that it's simply not the reality:
[benchmark chart: UUoY1nv.jpg]


Computerbase also do driver version reviews from time to time, and the overall performance trend is always flat or up, both on new and old GPUs. Sure, you might get individual decreases in one particular game with one particular version, but that's just the nature of software development on a complex project -- and GPU drivers are some of the most complex software out there.
 