
AMD vs. Nvidia GameWorks in Witcher 3

scitek

Member
I don't know about the physics and the other stuff in The Witcher 3, but here's a 30 second benchmark comparison I did of TressFX in Tomb Raider on my Nvidia card (970), and Hairworks in The Witcher 3.

Tomb Raider

TressFX ON
Min 72
Max 88
Avg 78.9

TressFX OFF
Min 83
Max 103
Avg 91.467



The Witcher 3

Hairworks OFF
Min 58
Max 74
Avg 67.4

Hairworks Geralt Only
Min 55
Max 68
Avg 61.2

Hairworks All
Min 54
Max 63
Avg 58.233

Anecdotal evidence, obviously, but just wanted to share.

Both games were in rainy conditions (if that matters), and I ran the exact same path for 30 seconds each time to make it as consistent as possible.
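If anyone wants to sanity-check numbers like these themselves, here's a rough sketch of how I'd pull min/max/avg out of a frametime log. The file name and the format (one frame time in milliseconds per line) are just placeholders, not any particular tool's output:

```python
# Rough sketch: derive min/max/avg FPS from a frametime log
# (one frame time in milliseconds per line). File name and format
# are placeholder assumptions.
def fps_stats(path):
    with open(path) as f:
        frame_ms = [float(line) for line in f if line.strip()]

    # Count frames per one-second bucket to approximate min/max FPS.
    buckets = []
    elapsed, count = 0.0, 0
    for ms in frame_ms:
        elapsed += ms
        count += 1
        if elapsed >= 1000.0:
            buckets.append(count)
            elapsed -= 1000.0
            count = 0

    # Average FPS = total frames / total seconds.
    avg_fps = len(frame_ms) / (sum(frame_ms) / 1000.0)
    return min(buckets), max(buckets), avg_fps

if __name__ == "__main__":
    lo, hi, avg = fps_stats("witcher3_hairworks_off.csv")
    print(f"Min {lo}  Max {hi}  Avg {avg:.3f}")
```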
 

QaaQer

Member
Nvidia has a history of lying to partners and consumers. They engage in anticompetitive tactics. And yet, defenders abound.

I seem to remember someone linked to a podcast where some of Nvidia's community guys (paid employees) were really pumped and ready to spread the word on message boards just prior to the 900 series release. Anyone remember that podcast?
 

Skyzard

Banned
Holy crap. I almost get that on an R9 290, except with everything on ultra (~35 fps).

Yeah, Kepler is busted on this game. However, it should also be noted that while kepler performance in driver updates has remained static for a while, R9 performance has increased quite a bit since launch.

I've read people say that. I hope it's not the case that Nvidia is being a super slimy piece of shit.

The 38fps lock is for no drops at all, though. I can get closer to high 40s/50fps most of the time, but I don't want a fluctuating or stuttering image - some places, such as part of the abandoned village in White Orchard, are poorly optimised compared to the rest of the game.

Please stop trying to pass a turd as something that is ok. Enable hairworks and then see what is really the deal here: You're still getting fucked.

Hairworks is the turd; Nvidia haven't made it playable for 99% of people. Same with FC4, and that has other busted GameWorks effects.

You don't think this has anything to do with it? 1440p doesn't run much better on my 970.

I can't bring myself to drop it; vegetation looks so bad to me at 1080p. Just a mush!
 

QaaQer

Member
You haven't even bothered to address any of my points, but you did a nice job of slipping in a complaint about some "passive-aggressive" tone, which you probably imagined since nothing in my post is intended to be insulting. My post stands exactly as I have written it and requires no further explanation. You can take it or leave it.



The lock-in features are the very definition of competition. You know what's anti-competitive? Forcing a company to open up proprietary technologies so another company which is losing can compete better. That's literally the opposite of how competition works. Nvidia invented it, and they aren't sharing. You keep what you invent. You invented it, it belongs to you, and you do what you want with it. That's competition.

Many people have explained many times why Intel's top-end CPU performance has not iterated dramatically for the past 5-6 years. I've explained it myself multiple times on GAF. I'm tired of it; figure out why yourself.

You really don't know what you are talking about, as that happens all the time in order to preserve competition: essential patents and patent expiration. So you'll excuse people for not paying attention to your analysis of the CPU market.

It is hard to preserve competition, especially in tech fields. Unregulated markets tend towards monopoly, which is, by definition, competition free.
 

OmegaDL50

Member
As an AMD owner, anytime I see that Nvidia logo on boot up I know there's some setting that's gonna tank my performance to sub 30fps.

Ironically, Saints Row the Third boots up with that "AMD Gaming Evolved" logo and it causes my GPU fan to scream and reach temps of 71C, yet Witcher 3 runs super quiet and runs exceptionally well despite being an Nvidia-optimized game.

I have an HD7950 too, and Saints Row 3 was one of those free game offers that came with my GPU, so to have it make my GPU act weird is very strange all in all.

You would think The Witcher 3 would be much more demanding and stress my card more, but it doesn't. I'd say this is more about SR3 having shit optimization than anything else.
 

Condom

Member
The lock-in features are the very definition of competition. You know what's anti-competitive? Forcing a company to open up proprietary technologies so another company which is losing can compete better. That's literally the opposite of how competition works. Nvidia invented it, and they aren't sharing. You keep what you invent. You invented it, it belongs to you, and you do what you want with it. That's competition.
You obviously have a different definition of competition than the regular norm.

In your world MS, Apple and Google can do everything they want in the name of competition. Even more, if I'm competing with you for a job, I'm allowed to smash your head in. Just competition, harr harr.

Come on, we have regulation on this kind of stuff and for a reason. Let your products compete, not your dollars.
 

knerl

Member
I don't know about the physics and the other stuff in The Witcher 3, but here's a 30 second benchmark comparison I did of TressFX in Tomb Raider on my Nvidia card (970), and Hairworks in The Witcher 3.

Tomb Raider

TressFX ON
Min 72
Max 88
Avg 78.9

TressFX OFF
Min 83
Max 103
Avg 91.467



The Witcher 3

Hairworks OFF
Min 58
Max 74
Avg 67.4

Hairworks Geralt Only
Min 55
Max 68
Avg 61.2

Hairworks All
Min 54
Max 63
Avg 58.233

Anecdotal evidence, obviously, but just wanted to share.

Both games were in rainy conditions (if that matters), and I ran the exact same path for 30 seconds each time to make it as consistent as possible.

What CPU do you have?
I have a 970 G1 Gaming running @ 1380MHz and an i5 2500K @ 4.3GHz and I don't get a 54fps min. Neither are you, to be honest. If you get up close to Geralt with Hairworks on, it can drop to 24fps. So unless your CPU gives you a huge boost, I don't know. Also, you don't mention what settings you're using in general, so I'll give you that before I start pointing fingers. Which drivers are you running? I'm using 352.86.
 

Gojeran

Member
TressFX certainly does not run great on my 970 in TR2013, and HairWorks runs like shit as well in The Witcher 3. I don't know what AMD is on about, but fewer complaints against Nvidia and more research and development might help them win more market share than crying on Twitter does.
 

Crisium

Member
http://www.extremetech.com/extreme/...surps-power-from-developers-end-users-and-amd

And this is from a year and a half ago, so consumers could have discouraged this.

According to Nvidia, developers can, under certain licensing circumstances, gain access to (and optimize) the GameWorks code, but cannot share that code with AMD for optimization purposes.

because AMD can’t examine or optimize the shader code, there’s no way of knowing what performance could look like. In a situation where neither the developer nor AMD ever has access to the shader code to start with, this is a valid point. Arkham Origins offers an equal performance hit to the GTX 770 and the R9 290X, but control of AMD’s performance in these features no longer rests with AMD’s driver team — it’s sitting with Nvidia.

You can read for yourself. It doesn't get any more official than this link here:

https://developer.nvidia.com/gameworks-sdk-eula

"NVIDIA GameWorks SDK" means the set of instructions for computers, in executable form only and in any media (which may include diskette, CD-ROM, downloadable internet, hardware, or firmware) comprising NVIDIA's proprietary Software Development Kit and related media and printed materials, including reference guides, documentation, and other manuals, installation routines and support files, libraries, sample art files and assets, tools, support utilities and any subsequent updates or adaptations provided by NVIDIA, whether with this installation or as separately downloaded (unless containing their own separate license terms and conditions).

2. "In addition, you may not and shall not permit others to:

I. modify, reproduce, de-compile, reverse engineer or translate the NVIDIA GameWorks SDK; or
II. distribute or transfer the NVIDIA GameWorks SDK other than as part of the NVIDIA GameWorks Application.

Any redistribution of the NVIDIA GameWorks SDK (in accordance with Section 2 above) or portions thereof must be subject to an end user license agreement including language that

a) prohibits the end user from modifying, reproducing, de-compiling, reverse engineering or translating the NVIDIA GameWorks SDK;
b) prohibits the end user from distributing or transferring the NVIDIA GameWorks SDK other than as part of the NVIDIA GameWorks Application;"

-----

Any Nvidia GW code inserted into the game cannot be modified, altered, or optimized by the developer without Nvidia's written permission. That means if the developer uses a particular GW SDK for some effect such as tessellation or HBAO+ and the game runs poorly afterwards, they either have to accept the performance hit or remove the SDK. It's take it or leave it.

Essentially what this means is black-box source code from Nvidia inside the game engine itself. Based on the EULA, that also means AMD (or Intel) cannot optimize their drivers around Nvidia's GW code, since it's closed and proprietary.

I recommend not encouraging this behavior.
 

Deadbeat

Banned
You obviously have a different definition of competition than the regular norm.

In your world MS, Apple and Google can do everything they want in the name of competition. Even more, if I'm competing with you for a job, I'm allowed to smash your head in. Just competition, harr harr.

Come on, we have regulation on this kind of stuff and for a reason. Let your products compete, not your dollars.
More like one person doing all the schoolwork, but you are forced to share all the answers with other people at your table who have done none of the work.

This isn't school, where the bottom end gets propped up to pass. You pull your own weight or you go down in flames.
 

QaaQer

Member
More like one person doing all the schoolwork, but you are forced to share all the answers with other people at your table who have done none of the work.

This isn't school, where the bottom end gets propped up to pass. You pull your own weight or you go down in flames.

Terrible analogy and total moralizing bullshit.
 

Pagusas

Elden Member
I'll always buy the GPU that gives me the best features/performance mix for the price. Right now that's Nvidia. If AMD can return to the glory days of ATI (like the 9800 Pro), I'll jump back on board.

Overall though, I see AMD as a lifeless husk of a company kept alive simply to give the appearance of competition. I wouldn't be surprised if Intel, Nvidia and AMD board members all sit in the same country club laughing at their "balanced" approach to keeping regulators at bay.
 

scitek

Member
What CPU do you have?
I have a 970 G1 Gaming running @ 1380MHz and an i5 2500K @ 4.3GHz and I don't get a 54fps min. Neither are you, to be honest. If you get up close to Geralt with Hairworks on, it can drop to 24fps. So unless your CPU gives you a huge boost, I don't know. Also, you don't mention what settings you're using in general, so I'll give you that before I start pointing fingers. Which drivers are you running? I'm using 352.86.

No, you're right, I didn't get up close in either benchmark, but both take significant hits when I do. Tomb Raider dropped to around 60fps, and The Witcher 3 was around 40. Those drops are the reason I keep Hairworks off in the latter. Neither is anything to brag about, though, especially for the minimal visual impact I think it has.

1080p all settings maxed in Tomb Raider

1080p all settings on Ultra EXCEPT Shadows and Foliage Distance at High in The Witcher 3


i5 2500k @ 4.5GHz
GTX 970 factory overclocked on latest drivers
 

Crisium

Member
I'll always buy the GPU that gives me the best features/performance mix for the price. Right now that's Nvidia. If AMD can return to the glory days of ATI (like the 9800 Pro), I'll jump back on board.

Overall though, I see AMD as a lifeless husk of a company kept alive simply to give the appearance of competition.

In the CPU arena that's all AMD is, but for GPUs you are misinformed if you think AMD hasn't been competitive with Nvidia at every level ever since the 4870 launched in summer 2008.

They had the power consumption advantage from 2008 to 2012, the VRAM advantage from 2008 until 2015, and the price advantage always, except for occasional one-month spans where Nvidia undercuts with a new launch (like when the 680 undercut the 7970... but the joke's on consumers, as it turns out the 7970 aged far better and was worth the extra money).

Hawaii is really pushing power consumption, but the price is right and it still has the full 4GB instead of the 970's partitions. And since the Omega drivers especially, AMD has almost every feature Nvidia has (PhysX is dead), except those locked away in proprietary GW code of course. The 290 series still offers equal or better performance than any 970 for less money on average, so I don't see a lifeless husk. How ridiculous. A lot of the anti-AMD sentiment here seems to be a way to rationalize buying a GPU based on feelings. At any price level, there is justification for either company but a lot of people pay more for Nvidia out of an emotional tie to the brands. A premium is something you pay, not something you get.
 

n0n44m

Member
Nvidia has a history of lying to partners and consumers. They engage in anticompetitive tactics. And yet, defenders abound.

So has AMD, and they're even more whiny about it

This Huddy guy lied in a direct interview with PCPer about TressFX code availability to Nvidia, as it turned out after the interview that, contrary to his claims, Nvidia didn't see any of it until release and was therefore unable to provide a driver.

Now he's here a year later accusing Nvidia of basically doing what they did themselves before?

Talking about defenders, I'm frankly surprised people are willing to defend AMD after all the bullshit their PR spouts. Just look at something like the whole FreeSync stuff. It's all empty promises and trying to make Nvidia look like money grabbers, then throwing some half-complete product out on the market so they can say they are on par feature-wise with Nvidia.

At any price level, there is justification for either company but a lot of people pay more for Nvidia out of an emotional tie to the brands. A premium is something you pay, not something you get.

Premium features! Like 3D Vision, which leads to 120+ Hz gaming, which leads to G-Sync and ULMB. Or stuff like Shadowplay, all the HBAO and AA techniques, and of course a focus on smooth framepacing over outright FPS. Then there are the drivers which actually let you do stuff like downsampling or SGSSAA...

All thanks to Nvidia, who actually try to come up with new stuff that improves PC gaming, and who then actually make sure this stuff is used by developers (where did stuff like TrueAudio go, AMD?)

"emotional" heh.


---


Nvidia are smug bastards who want your money by eliminating the opposition

AMD are cheap bastards who want your money by convincing you that the opposition is evil

I doubt this tactic is working for AMD, but still, let's hope AMD is more successful in the future than they have been for the past few years, because Nvidia will charge us Titan money for a cut-down x60 Ti version of their cards if AMD steps out of the game ;(
 

NeOak

Member
While Nvidia was making moves into the ARM sector, AMD decided to make fucking desktop RAM, which has crazy low profit margins.
Allow me to correct you:

NVIDIA only made their ARM CPU after they tried to get an x86 license from Intel with threats over their Transmeta-like core that was the original Project Denver. Intel sent them packing and they had to use ARM because otherwise they would have been sued into oblivion by Intel.

And it's not like NVIDIA is having real success with them. Tegra was only used in the Zune HD. Tegra 2 was used a bit but wasn't what was promised in terms of thermals, which is suicide in the embedded world. Tegra 3 was the Surface RT and Ouya lol. Tegra 4 only in the Surface 2 and Shield.

After that, it has been only the Shield produced by NVIDIA and, out of nowhere, the Nexus 9. What a great investment lol.

As for the RAM, that was branding only, because there was a market for it due to complete AMD platforms. If you look at what they did, it was just so certain builders could offer "AMD systems".
 
I don't think that Nvidia sabotaged Hairworks for AMD. It's just that it relies heavily on tessellation, which Nvidia cards, especially Maxwell, are better at. It's also a reason why changing the tessellation settings in AMD CCC gives such a performance boost.

TressFX gave the edge to AMD, on the other hand, because it relied heavily on GPU compute, which AMD cards are better at.

If I get the whole situation right, at least...
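To make the tessellation point concrete, here's a back-of-the-envelope sketch; the strand and segment counts are made-up numbers purely to illustrate why capping the factor (like the CCC override does) cuts the workload so much:

```python
# Rough illustration only: how capping the tessellation factor shrinks the
# geometry a HairWorks-style effect has to push. All numbers are assumptions.
HAIR_STRANDS = 20_000        # hypothetical strand count for one character
SEGMENTS_PER_STRAND = 4      # hypothetical base segments per strand

def verts(tess_factor):
    # Each base segment gets subdivided roughly tess_factor times.
    return HAIR_STRANDS * SEGMENTS_PER_STRAND * tess_factor

default = verts(64)  # the uncapped factor often cited for HairWorks
capped = verts(16)   # e.g. a driver-side cap like AMD's CCC tessellation slider
print(f"64x: {default:,} verts vs 16x: {capped:,} verts "
      f"({default / capped:.0f}x less geometry when capped)")
```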
 

LCGeek

formerly sane
Hawaii is really pushing power consumption, but the price is right and it still has the full 4GB instead of the 970's partitions. And since the Omega drivers especially, AMD has almost every feature Nvidia has (PhysX is dead), except those locked away in proprietary GW code of course. The 290 series still offers equal or better performance than any 970 for less money on average, so I don't see a lifeless husk. How ridiculous. A lot of the anti-AMD sentiment here seems to be a way to rationalize buying a GPU based on feelings. At any price level, there is justification for either company but a lot of people pay more for Nvidia out of an emotional tie to the brands. A premium is something you pay, not something you get.

Features aren't the problem, it's the bad software.

There's nothing on AMD like Nvidia Inspector, which is sad. RadeonPro, despite the fact I use it, doesn't count: it can drop frames in newer games like GTA5 and on top of that isn't fully compatible with DX11 tweaks being applied. AMD also sucks at downsampling - the fact that I know my 7950 can do it via driver registry tweaks, which are annoying as hell, but somehow can't do it from the driver software like Nvidia can, is outright sad. Other than that they are pretty close, but for enthusiasts or people who run into compatibility problems, one side is far less hassle than the other.

Reasons like these are why team green is still solid in the eyes of most.

AMD refuses to man up in this area and instead wants to gripe about Nvidia. Anyone who's studied the DX11 issue should be asking them: if they're whining about Nvidia here, then why is the core of their drivers so shitty?
 

Braag

Member
I do hope that whatever the Kepler fix is going to be, it's actually something useful.
Yeah I hope so too. It's weird how much better GTAV runs on my 770 4GB compared to Witcher 3 considering how much more is usually going on in that game.
The FPS drops to 25 or so at some points with Hairworks off and a couple of the settings, like foliage range and shadows, at high.
I'm not expecting to run Hairworks with my card, but I was expecting a steady 30fps without having to compromise much in other visual aspects.
 

x3sphere

Member
Hard not to think of conspiracy theories when you see Kepler vs Maxwell performance numbers for Hairworks.

*removes fistfuls of salt from pockets*

Hrm, the poor Kepler performance doesn't seem to be due to GameWorks features

http://gamegpu.ru/images/remote/htt...e_Witcher_3_Wild_Hunt-game-new-2560_u_off.jpg

http://gamegpu.ru/images/remote/htt...2_Assassins_of_Kings-test-witcher3_2560_u.jpg

Looking at those benches, Kepler cards get a similar performance hit to Maxwell once you turn on GW features. It's the base performance that is bad relative to Maxwell. AMD base performance is good, but terrible once GW features are enabled, so whatever's affecting Kepler is a different thing.
 

gogogow

Member
Allow me to correct you:

And it's not like NVIDIA is having real success with them. Tegra was only used in the Zune HD. Tegra 2 was used a bit but wasn't what was promised in terms of thermals, which is suicide in the embedded world. Tegra 3 was the Surface RT and Ouya lol. Tegra 4 only in the Surface 2 and Shield.

After that, it has been only the Shield produced by NVIDIA and, out of nowhere, the Nexus 9. What a great investment lol.

Actually, a lot more products used/use Tegra 3, 4 and K1, like Toshiba, Acer, Asus and Xiaomi tablets, and Xiaomi, LG and HTC phones.
http://www.nvidia.in/object/tegra-phones-tablets-in.html
http://www.eurogamer.net/articles/digitalfoundry-2014-xiaomi-mipad-review
 
Before AMD complains about how their driver doesn't work well with Nvidia-designed code, they need to actually make a stable, working driver for Linux and BSD.

If they can't get their driver to handle normal desktop tasks, it's no wonder their drivers don't work well in games, even ones that aren't "The Way It's Meant to Be Played".

(Sorry for the rant. I need to use OSes other than Windows sometimes and I got burned pretty hard by AMD drivers.)
 

Crisium

Member
Premium features! Like 3D Vision, which leads to 120+ Hz gaming, which leads to G-Sync and ULMB. Or stuff like Shadowplay, all the HBAO and AA techniques, and of course a focus on smooth framepacing over outright FPS. Then there are the drivers which actually let you do stuff like downsampling or SGSSAA...

Overall, Nvidia has a bit of an advantage here. But I'm not sure if you are unaware of the alternatives or purposely leaving them out.

3D Vision = HD3D
G-Sync = FreeSync
Shadowplay = GVR
HBAO and AA techniques - are you referring to GW HBAO? TXAA is a nice option though.
You can downsample on AMD now: VSR

ULMB is nice as an alternative to G-Sync, but it's one or the other, sadly.

You can argue about which feature is better, but generally both companies have a solution. If you are hardcore tied to an Nvidia feature and know it isn't as good on AMD, then that's one thing. But those are truly rare people. I would recommend Nvidia if someone was very dedicated to 3D gaming, personally. But look at our own PC recommendation thread, where almost no one ever mentions these features.

All thanks to Nvidia, who actually try to come up with new stuff that improves PC gaming, and who then actually make sure this stuff is used by developers (where did stuff like TrueAudio go, AMD?)

How exactly does Nvidia make PC gaming stronger? What makes PC gaming stronger is the developer focusing on making the game specifically for the PC, which hardly any of them do anymore. Taking a console port and adding GW features isn't proof that the developer is dedicated to PC gaming. Rockstar reworking GTA V shows they actually did care. There are plenty of Nvidia GW titles which are 97% console ports through and through, and no amount of TXAA or PhysX can hide it. Injecting code and forbidding devs from modifying it just hurts AMD, Intel, and Nvidia users (some people paid a lot of money for a 780 and want to keep it).

Are you going to say Nvidia buying Ageia PhysX has helped the technology? By the very act of Nvidia acquiring Ageia, they have destroyed any chance of next-generation advanced physics effects for all PC gamers. Since that acquisition, PhysX remains DOA except for Batman games and the BL franchise. Almost a decade has passed and so far we've hardly seen any progress with PhysX, thanks to its locked/proprietary nature that alienates Intel/AMD users. If Nvidia allowed AMD/Intel GPUs to run PhysX, PhysX would have exploded in games by now.

GW and PhysX are basically the exact same thing - a process of vendor lock-in to try to make your products look better than the competitors' (and your own older products, Kepler) at the expense of everyone else in the market who doesn't own a new Nvidia product.
 

Crisium

Member
AMD refuses to man up in this area and instead wants to gripe about Nvidia

I'm not ignoring the rest of your post; you made good points there. But this is a GW thread, and I've seen people say AMD can do GW too, and should man up and do it and quit being jealous.

But do we want both companies doing this? GW goes against everything PC gaming has stood for for decades. GPU hardware makers should never be allowed to alter or modify any game code unless it's open source. AMD and Nvidia are hardware providers, not game developers.

PC game development has always revolved around benefiting the PC gaming community. That community involves everyone: Matrox, Intel, VIA, AMD, Nvidia, etc. Anyone who makes a GPU should have open source code, and no one should be allowed to provide/insert proprietary source code into any game that adversely affects PC gamers who have other hardware.

GW is the antithesis of what PC gaming stands for. AMD should not join them on this path. It is abhorrent.

If AMD/NV start providing closed-source proprietary alternative code paths in AAA games, and the developer isn't allowed to alter, manage, or optimize ANY of that code (see the GW EULA which I linked to earlier), AMD/NV will have FULL control over obsoleting older generations of cards at will and we will be at their mercy. Look at Kepler now. Unless you want to upgrade every generation, this sounds like a horrible future, not to mention being forced to own both an NV and an AMD card in the same rig.

I don't think AMD is some altruistic, holier-than-thou entity, but compared to Nvidia right now? I'm glad they aren't screwing over consumers. Maybe that's manning up in their own way. LCGeek, you're right that AMD has to go further on the software side, but Nvidia is actively and willingly harming consumers (even their own!) and I consider that worse than AMD's minor software blemishes.
 

NeOak

Member
Actually, a lot more products used/use Tegra 3, 4 and K1, like Toshiba, Acer, Asus and Xiaomi tablets, and Xiaomi, LG and HTC phones.
http://www.nvidia.in/object/tegra-phones-tablets-in.html
http://www.eurogamer.net/articles/digitalfoundry-2014-xiaomi-mipad-review

Phones? One.

Tablets? "7" between both T3 and T4.

How many Mediatek and Qualcomm? Also, that list is old.

NVIDIA listed "design wins" and liked to show the list in their conferences. They stopped. Guess why?
 

badb0y

Member
Yeah, Apple loves sabotaging their products so much, they're going to make a special version of iOS 9 just so my aging iPhone 4S can live another year with software updates. /s

I thought most of GAF was above the current fashion of blind Apple hate.

Your iPhone 4S does not run iOS 8 efficiently; just because you are willing to accept subpar performance doesn't mean it's not garbage.

I own an iPhone 4, 5, and 6. I see firsthand the performance across the phones.
 

Sinistral

Member
If that's the case, then either AMD is not reaching out or developers don't care to use their tech.

Or... CD Projekt Red is not Rockstar. A 5 million deal for GameWorks would be pennies for Rockstar but a substantial benefit for CD Projekt Red. With the downgrades, console-focused development and lying, I believe this is a case of being in over your head.

Nvidia has money and AMD not so much, so it's easier for Nvidia to throw its weight around. Developers at that point might not have a choice when their publisher, the ones handling their paycheck and business income, says they're using a certain tech.
 

badb0y

Member
Their new statement has a few issues; for example, they claim that TressFX works great on Nvidia hardware in Tomb Raider.

OP, do you even research, bro? TressFX works very well on Nvidia hardware.
 

dex3108

Member
OP, do you even research, bro? TressFX works very well on Nvidia hardware.

Yes, I did research in 2013 when TR was out and had issues on Nvidia systems. Nvidia drivers were late because Nvidia didn't have access to the game before release, and before a few patches and a new driver, TressFX worked really badly on Nvidia hardware.
 

Crisium

Member
And TressFX is open source. The thing with AMD GE games is that while Nvidia may be late, they can catch up. It doesn't work the other way around with GW.
 

Sinistral

Member
Yes, I did research in 2013 when TR was out and had issues on Nvidia systems. Nvidia drivers were late because Nvidia didn't have access to the game before release, and before a few patches and a new driver, TressFX worked really badly on Nvidia hardware.

And even without day-one drivers for Witcher 3, AMD users are able to tweak the tessellation amount in a way Kepler users can't, and the AA amount for both AMD and Kepler. That shows drivers are largely irrelevant and it's the default overtuning of effects that is hurting everyone but Maxwell users. Still, 780 Ti overall performance is questionable, while the 290X is trading blows with 970s.
 

Kezen

Banned
Yes, I did research in 2013 when TR was out and had issues on Nvidia systems. Nvidia drivers were late because Nvidia didn't have access to the game before release, and before a few patches and a new driver, TressFX worked really badly on Nvidia hardware.

I can confirm that. Literally unplayable on my 670 before driver updates; even then the framerate wasn't "great", but I suppose my GPU was to blame.
 

Sinistral

Member

Great link, but AMD does it too; it's mandatory, and it's the current sad state of PC game development. Drivers are huge black boxes, which is why the notion of AMD's lack of support that people parrot baffles me. Both companies HAVE to be involved on a driver level for developers to make a decently functioning game these days.

Mantle's main purpose was to alleviate the huge driver dependency. Because DX12, Vulkan and Metal are built upon Mantle's foundation, this is going to be a huge change to the landscape for those who utilize it.
 
Sounds like Nvidia is leveraging their market dominance to further dissuade consumers from purchasing AMD products. It's not going to make them look like the nicest company in the world, but it will work to further erode AMD's market share.
 

Soltype

Member
Or... CD Projekt Red is not Rockstar. A 5 million deal for GameWorks would be pennies for Rockstar but a substantial benefit for CD Projekt Red. With the downgrades, console-focused development and lying, I believe this is a case of being in over your head.

Nvidia has money and AMD not so much, so it's easier for Nvidia to throw its weight around. Developers at that point might not have a choice when their publisher, the ones handling their paycheck and business income, says they're using a certain tech.

So AMD can't keep up? I'm not seeing how this makes Nvidia bad.
 
http://www.extremetech.com/extreme/...surps-power-from-developers-end-users-and-amd

And this is from a year and a half ago, so consumers could have discouraged this.


You can read for yourself. It doesn't get any more official than this link here:

https://developer.nvidia.com/gameworks-sdk-eula


-----

Any Nvidia GW code inserted into the game cannot be modified, altered, or optimized by the developer without Nvidia's written permission. That means if the developer uses a particular GW SDK for some effect such as tessellation or HBAO+ and the game runs poorly afterwards, they either have to accept the performance hit or remove the SDK. It's take it or leave it.

Essentially what this means is black-box source code from Nvidia inside the game engine itself. Based on the EULA, that also means AMD (or Intel) cannot optimize their drivers around Nvidia's GW code, since it's closed and proprietary.

I recommend not encouraging this behavior.

That does seem pretty shady by Nvidia. In fact, everything that I read about Nvidia makes them seem like a company with shitty business practices. Unfortunately they're dominant right now because AMD has been mismanaged for almost a decade.
 

Ke0

Member
Any Nvidia GW code inserted into the game cannot be modified, altered, or optimized by the developer without Nvidia's written permission. That means if the developer uses a particular GW SDK for some effect such as tessellation or HBAO+ and the game runs poorly afterwards, they either have to accept the performance hit or remove the SDK. It's take it or leave it.

Essentially what this means is black-box source code from Nvidia inside the game engine itself. Based on the EULA, that also means AMD (or Intel) cannot optimize their drivers around Nvidia's GW code, since it's closed and proprietary.

I recommend not encouraging this behavior.

And if AMD were a decently run company, this is the perfect area to dump millions into (not fucking desktop memory). All they need to do is provide a solution that is just as good and (this being very important) just as well documented and easy to implement as the competition's, and provide developers as much engineering expertise as the competition would.

When developers have no alternatives, you can't fault Nvidia for seeing an opportunity when it arises.

It amazes me how people say Nvidia is the company with shitty business sense and practices when they're the company dominating every GPU-based field they enter. They dominate because AMD lets them dominate these fields, almost literally.

I mean, if you ran a GPU company and saw that your competitors were announcing they were going to enter the ARM sector, would you follow, or would you make your own announcement involving desktop memory modules? A field which has razor-thin profit margins as is?
 