
Nvidia's Gameworks nerfing AMD cards and its own previous-gen GPUs?

Swarna

Member
https://www.youtube.com/watch?v=O7fA_JC_R5s

Unnecessary tessellation, 780 being outperformed by the 960, Nvidia cards dropping performance post-launch and more. What do you make of this?

I gotta say, being new to NV with my 970, I certainly don't like the idea of performance dropping after the next generation is released. They already cheaped out on the VRAM...
 

Skux

Member
Saw this on PCMR a few days ago.

I'd say it's deliberate. The Fallout 4 example makes it plain as day. It's planned obsolescence in action. You can bet that Pascal will do the same to Maxwell.
 

UnrealEck

Member
I've seen this video before. By someone else. Maybe a year or two ago.
The tessellation in Crysis 2 is really suspect. Some of the tessellation in games and programs looks ridiculous too. To the point of excess.
 
I agree that they do purposely push unnecessary amounts of tessellation when the same effect, or something very close to it, could be accomplished with far fewer polygons, simply because it hurts AMD a lot more. I don't think Kepler's poor performance is a planned initiative though. It's just an architecture poorly suited to the type of rendering the majority of developers are pursuing. I expect Maxwell to be in a similar situation in a year or so, just not as extreme.
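For a sense of scale, here's a rough back-of-the-envelope sketch (the patch count and factor values are made-up illustrative numbers, not figures from the video): the triangle output of a uniformly tessellated patch grows roughly with the square of the tessellation factor, so pushing a scene from x16 to x64 is about 16x the geometry work for detail that's often sub-pixel anyway.

```c
/* Rough sketch with invented numbers (not from the video): for a uniformly
 * tessellated triangle patch, the generated triangle count grows roughly
 * with the square of the tessellation factor, which is why capping the
 * factor matters so much. */
#include <stdio.h>

int main(void)
{
    const int patch_count = 1000;                 /* hypothetical mesh size */
    const int factors[]   = { 4, 8, 16, 32, 64 }; /* typical factor range   */
    const int n = sizeof factors / sizeof factors[0];

    for (int i = 0; i < n; i++) {
        long long tris = (long long)patch_count * factors[i] * factors[i];
        printf("tess factor x%-2d -> ~%lld triangles\n", factors[i], tris);
    }
    return 0;
}
```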
 

ponpo

( ≖‿≖)
That Witcher 3 hair example is seemingly damning, or at least illustrative of some pointless effects.
 

Trojita

Rapid Response Threadmaker
This shit should be illegal. I shudder at the possibility of an Nvidia-only market.
 

Zaptruder

Banned
Jumping back on the AMD bandwagon after my 980Ti.

Fuck Nvidia man... I don't want to be stuck holding a card with shit performance after next generation just because of Nvidia's (fairly veiled) attempts at planned obsolescence.

Can't watch right now, can someone please summarize?

1. Nvidia aggressively works with developers to incorporate their software into their games, prioritizing development on Nvidia cards.
2. Works great when pairing launch titles with their newest cards.
3. And will make their newest cards look better than AMD's stuff.
4. But only because they're working with developers in a way so as to exploit performance differentials between Nvidia/AMD, to no benefit of the user (i.e. imperceptible differences in image quality). Meaning they're willingly taking an unnecessary hit in performance with these heavily marketed effects... so that AMD will take an even bigger hit.
5. Insidiously - they'll keep updating these effects so that they only work well on their latest generation of cards, and purposefully design the drivers to be inefficient with their older cards (treating their older cards like AMD essentially). Meaning that over time, your effective performance with Nvidia cards will be tanked, so as to motivate you to upgrade to new Nvidia cards.

In other words, they're manipulating and gaming the way the industry produces games and reviews hardware to make their newest stuff look better than it is... and make AMD AND their older cards look worse than they are.
 
This is seriously damning of Nvidia. Great investigative journalism. Kudos to the video maker.

I think I will be going with AMD in my planned build to show my discontent with the situation. Even if I have to deal with lower framerates, I can't support practices that are designed to work against the consumer but get passed off as a benefit.
 

WaterAstro

Member
I already jumped off Nvidia for my newest card.

They're so shady, trying to monopolize the market. Too bad the average person won't know about their practices.
 

Bl@de

Member
Yeah, something isn't right here. When I bought the 770 it was always +/- 3% against the 280X. Now that some time has passed I see more and more benchmarks with the 280X at +10% (for example: RoTR, +10% min. FPS).
 
I remember hearing about this around the time Crysis 2 was out: something like tessellated water being rendered unnecessarily beneath the map, which eventually had AMD release the tessellation override option that still ends up helping now, as in the case of The Witcher 3. I love my 980Ti for downsampling but I miss AMD's superior anisotropy.

Edit: Welp, watched video, should've done that first.
 
I thought this was pretty well known after the Witcher 3 release, e.g. here or here

That said, IIRC the Crysis 2 example is disingenuous; it doesn't actually use that much tessellation in-game.
 

Ragona

Member
I've read that turning tessellation down in the Crimson menu helps at least a bit. Can anybody confirm that?
Guess devs just take the path of least resistance.
 

schuey7

Member
Yup, nothing new. The difference between my laptop with an Nvidia card and my desktop with an AMD card has increased an alarming amount with each new game. I know that my next desktop purchase will be AMD for this very reason. My 7950 is still being optimised for, and I expect AMD to keep this up.
 

thuway

Member
Honestly this garbage won't stop unless gamers actually do something and stop purchasing Nvidia cards. The problem is, as a 280X owner here, we've been shafted on the AMD front for a very long time too. Also, the performance Nvidia offers per watt is unmatched. Alongside all the "exclusive" features, it's Nvidia's game unless AMD gets aggressive, and that's not happening any time soon.

In reality, I don't think there is any legal course of action gamers can take, is there?
 
I noticed this with my 780; it was doing increasingly poorly and I thought I was going crazy. Then I read a while ago that they were fucking their old cards on purpose.

I'm glad I've gone to AMD. If you get a FreeSync monitor you don't have to worry too much about Nvidia trying to fuck with performance.
 
The issue here is both actually smaller AND more widespread.

What happens is that optimization is only done for newer hardware: as new-tier hardware comes out, Nvidia simply puts the time into optimizing for the biggest seller at that point.

Right now the biggest seller is the 970, so all driver development and optimization goes there, at the expense of the previous-generation 7XX. When DX12 actually becomes more than vaporware you can expect the current 9XX to perform on par with or WORSE than even consoles. I've said this before and people won't believe it, but that won't stop it from happening.

But the big problem here is that AMD does pretty much the same.
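To make that concrete, here's a purely hypothetical sketch (all names are my own invention, nothing from an actual driver) of what "only optimize for the current biggest seller" looks like as a code pattern: per-title tuning gated on the GPU generation, with older architectures falling through to a generic path.

```c
/* Purely hypothetical illustration of generation-gated optimization.
 * None of these names or paths come from an actual driver. */
#include <stdio.h>

typedef enum { ARCH_KEPLER, ARCH_MAXWELL } gpu_arch_t;

static void run_game_profile(gpu_arch_t arch, const char *game)
{
    switch (arch) {
    case ARCH_MAXWELL:
        /* current generation: per-title tuned shaders and game-ready fixes */
        printf("%s: per-title optimized path\n", game);
        break;
    default:
        /* older generations: whatever generic path already existed */
        printf("%s: generic legacy path\n", game);
        break;
    }
}

int main(void)
{
    run_game_profile(ARCH_MAXWELL, "new AAA release");
    run_game_profile(ARCH_KEPLER,  "new AAA release");
    return 0;
}
```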
 

Vuze

Member
Has somebody reverse-engineered the drivers to prove this independently of benchmarks? It would just be interesting to see what exactly they do to tamper with the performance of older cards.
 

Renekton

Member
Half-expecting some replies here to be:

"I hope AMD does well because I want Nvidia cards, which I exclusively buy, to be cheaper."
 
Wow, this is insane.

Like I don't want to even touch Nvidia but how can you fight this? Just buying an AMD card and suffering with the lower performance?

Edit: Actually after doing some research it seems like most of this guy's arguments are bullshit. Looks like he might just be trying to stir up drama :/
 

KHlover

Banned
AMD better become competitive enough to do something about this. Doesn't sound good for the resale value of my GPU when I'll inevitably buy the next one down the road.
 

DonMigs85

Member
Hopefully this gains traction with the various tech sites like AnandTech, Tom's Hardware and others, unless Nvidia buys them off. I'd like to see Jen-Hsun Huang make an official statement.
 

AHindD

Member
This shit isn't new; Nvidia has been a blight on the gaming community for some time now, yet people keep buying their anti-competitive shit.

The people turning around and going "Well, that's it, I'm going AMD after my 980Ti" are the best.

Anyway, the last Nvidia card I had was a 6600GT, and it looks like it'll stay that way.
 

Renekton

Member
Hopefully this gains traction with the various tech sites like AnandTech, Tom's Hardware and others, unless Nvidia buys them off. I'd like to see Jen-Hsun Huang make an official statement.
They have no reason to respond; they're a monopoly now and nobody cares.

At worst, if this accusation is true, they will pull a GTX970 again with "we did it for you and you are not appreciative"
 

Zaru

Member
Soooo.... are people going to stop buying Nvidia cards now?
Obviously not. Which means they won't even react.
 
Yeah, something isn't right here. When I bought the 770 it was always +/- 3% against the 280X. Now that some time has passed I see more and more benchmarks with the 280X at +10% (for example: RoTR, +10% min. FPS).

It's much more than 10%.

May 2013: [relative performance chart: perfrel_1920.gif]

January 2016: [relative performance chart: perfrel_1920_1080.png]
 

Locuza

Member
Only the Crysis thing; the rest is still true (e.g. over-tessellating HairWorks)
What was the reason behind the Crysis 2 case?

And HairWorks is not over-tessellated; the whole technical solution is built on tessellation, so it needs a high tess factor to look good.
Whether that kind of approach is the best idea is another story.

Has somebody reverse-engineered the drivers to prove this independently of benchmarks? It would just be interesting to see what exactly they do to tamper with the performance of older cards.
It's very unlikely that Nvidia tanks their own cards.
But the cases are of course annoying when they appear.

Like I don't want to even touch Nvidia but how can you fight this? Just buying an AMD card and suffering with the lower performance?
For the average man/woman there is nothing to do.

I expect nothing to change unless AMD gets its ass in gear.
Actually AMD has got its ass in gear, but the gear needs a "little" bit of time.
Aside from that, I'm very happy that AMD's work on several fronts is coming to fruition.
 
I knew my next upgrade would be an AMD card, but this just strengthens my resolve not to put money into a 750 Ti as a stopgap measure.
 

HowZatOZ

Banned
If it is happening, I'm definitely not noticing it on my 770, but then again I don't really notice much. If what's in the OP is true, that's some dirty shit, and I won't be supporting them come the HBM2 cards.
 

Zaptruder

Banned
The people turning around and going "Well, that's it, I'm going AMD after my 980Ti" are the best.

The only reason I have the 980Ti is because AMD were dragging their feet with the Fury Pro release at the time and my old computer died, forcing my hand in upgrading and building the new comp (which would've happened anyway, just a month sooner than I had planned). Figured I'd give Nvidia a shot again, see if it's all that or I was missing anything. No, not really!
 

DonMigs85

Member
They have no reason to respond; they're a monopoly now and nobody cares.

At worst, if this accusation is true, they will pull a GTX970 again with "we did it for you and you are not appreciative"
Sadly, that's probably true. Even Intel didn't get a whole lot of bad publicity after they were forced to pay AMD for their anti-competitive practices. I think their x86 compiler also still penalizes non-Intel CPUs.
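On the compiler point: the widely reported behaviour was that Intel's runtime dispatcher selected code paths based on the CPU vendor string rather than the actual feature flags. Here's a hedged illustration of that pattern (not Intel's actual code, just the idea, using the GCC/Clang <cpuid.h> helper): checking the vendor instead of the features means a non-Intel CPU gets the slow path even when it supports the same instructions.

```c
/* Illustrative only: a dispatcher that keys on the CPU vendor string
 * instead of the actual feature flags, the behaviour Intel's compiler
 * runtime was criticised for. Compile with GCC/Clang on x86. */
#include <cpuid.h>
#include <stdio.h>
#include <string.h>

static int vendor_is_intel(void)
{
    unsigned int eax, ebx, ecx, edx;
    char vendor[13] = { 0 };

    if (!__get_cpuid(0, &eax, &ebx, &ecx, &edx))
        return 0;

    memcpy(vendor + 0, &ebx, 4);   /* vendor string lives in EBX, EDX, ECX */
    memcpy(vendor + 4, &edx, 4);
    memcpy(vendor + 8, &ecx, 4);
    return strcmp(vendor, "GenuineIntel") == 0;
}

int main(void)
{
    if (vendor_is_intel())
        puts("fast SSE/AVX code path");   /* chosen by vendor, not by feature bits */
    else
        puts("generic baseline path");    /* even if the CPU supports SSE/AVX      */
    return 0;
}
```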
 
Has somebody reverse-engineered the drivers to prove this independently of benchmarks? It would just be interesting to see what exactly they do to tamper with the performance of older cards.

This is the problem. There's no real "deliberate tampering".

What happens is simply that optimization focuses on new hardware, so as more time passes the old cards fall further behind because they're still running legacy code paths that no longer get the same attention.
 
You guys probably missed the post showing Nvidia cards having lower image quality compared to an AMD card with the exact same settings. There was a post exposing that months ago. I forget which Nvidia cards they compared.
 

Zaptruder

Banned
This is the problem. There's no real "deliberate tampering".

What happens is simply that optimization focuses on new hardware, so as more time passes the old cards fall further behind because they're still running legacy code paths that no longer get the same attention.

Of course, this isn't necessarily an unexamined externality. It could be just as much by design as from neglect.

Certainly works well in favour of Nvidia's marketing machine anyway.
 

Apt101

Member
Hmm. Looks mighty suspect. I purchased a 970 last year. I forget exactly what I paid but it was more than $300. I don't like the idea of that card being made even slightly obsolete for 1080p gaming within the next three years.

I don't know if I want to switch to AMD anytime soon either.
 
The issue is that the video is misleading.

It's both worse and better. Better because it's not really "malice" deliberately slowing down hardware.

And it's worse because it's not limited to Gameworks; it's widespread across EVERYTHING in the drivers. So it's going to affect every game out there, not just a few specific features. And it will get bigger and bigger as time passes.

This means the flawed arguments people are using to criticize the issue will let Nvidia paint the whole thing as false, while the more serious issues that actually are valid get dismissed along with them.
 