
AMD vs. Nvidia GameWorks in Witcher 3

Jedi2016

Member
Have to say I'm not really seeing what the problem is here. One company tries to make its products better in order to compete?

How many of the people bitching are AMD users? And how many are NVidia users? I honestly never understood this side of it. PC gamers claim to put themselves above the console crowd by not being fanboys of one or the other, and yet it winds up being the exact same arguments over which is "better". I thought PC gamers had a choice? Just buy what works best, fuck branding. There's no reason for loyalty here, there are no exclusives. Just buy what works best, the end.
 

Blackage

Member
This is super scummy and as an nvidia user it makes me very angry. I usually prefer nvidia but I may switch to AMD now.

Why would you switch to AMD if you're already using Nvidia, though? It's like taking on inferior performance just to prove a point to who, exactly?
 

Aroll

Member
And if AMD were a decently run company, this is the perfect area to dump millions into (not fucking desktop memory). All they need to do is provide a solution that is just as good, that is (this being very important) as well documented and easy to implement as the competition's, and give developers as much engineering expertise as the competition would.

When developers have no alternatives, you can't fault Nvidia for seeing an opportunity when it arises.

They did for Hairworks. It's called TressFX and it was around first, is easily employed on any GPU, is well documented, and has its code out there, able to be altered via drivers.

This is pretty much what AMD has always done. As for "not desktop memory" - why not? HBM is the next great step after GDDR5. GDDR5 is currently used in mobile devices and laptops. Theoretically, HBM is going to eventually be used in said devices as well. Is it starting on the desktop side? Of course, but all the tech that goes into that stuff starts there before it's shrunk even further and optimized even more for smaller devices.

The question really isn't whether they should be focusing on HBM on the desktop, the question is - can they stay ahead on the tech? They don't make that sort of thing exclusive - which means that yes, NVidia cards are going to be HBM-based eventually if it proves to really be that much better. So, once that happens, does AMD keep their edge on the tech, or do they lose it? Remember, AMD invented the GDDR tech, but within a year they fell behind on making the best GPUs for it.
 

Aroll

Member
Why would you switch to AMD if you're already using Nvidia, though? It's like taking on inferior performance just to prove a point to who, exactly?

It's more about supporting a company that treats consumers better. I could go to one of five gas stations near my house. My choice is based on which one has the better customer service. The one I go to isn't the best one, it lacks some beverages and is smaller, but they are much nicer to me when I'm inside.

That's what this is about, really. Yes, going with, say, a 290x is not as good as a top-end NVidia (Titan, etc). But chances are they aren't going from a Titan to a 290x, but from an older Nvidia to, say, a 290x, which means the 290x is PROBABLY better than what they are using anyway, since the only comparable card is the 980, and that's only better for Witcher 3 because of GW. Not because the card is "so much better" than the 290x.
 
It's more about supporting a company that treats consumers better. I could go to one of five gas stations near my house. My choice is based on which one has the better customer service. The one I go to isn't the best one, it lacks some beverages and is smaller, but they are much nicer to me when I'm inside.

That's what this is about, really. Yes, going with, say, a 290x is not as good as a top-end NVidia (Titan, etc). But chances are they aren't going from a Titan to a 290x, but from an older Nvidia to, say, a 290x, which means the 290x is PROBABLY better than what they are using anyway, since the only comparable card is the 980, and that's only better for Witcher 3 because of GW. Not because the card is "so much better" than the 290x.

I'm not gonna pretend Nvidia's a nice company looking out for their customers, but neither is AMD, and the infrequency of their driver updates is a perfect example of that. Neither company is going to treat you nicely, because they both share one common and sole interest: your cash. Between Nvidia and AMD it's hard to pick a lesser of two evils, since they're both shit at different things.
 

gogogow

Member
They did for Hairworks. It's called TressFX and it was around first, is easily employed on any GPU, is well documented, and has its code out there, able to be altered via drivers.

This is pretty much what AMD has always done. As for "not desktop memory" - why not? HBM is the next great step after GDDR5. GDDR5 is currently used in mobile devices and laptops. Theoretically, HBM is going to eventually be used in said devices as well. Is it starting on the desktop side? Of course, but all the tech that goes into that stuff starts there before it's shrunk even further and optimized even more for smaller devices.

The question really isn't whether they should be focusing on HBM on the desktop, the question is - can they stay ahead on the tech? They don't make that sort of thing exclusive - which means that yes, NVidia cards are going to be HBM-based eventually if it proves to really be that much better. So, once that happens, does AMD keep their edge on the tech, or do they lose it? Remember, AMD invented the GDDR tech, but within a year they fell behind on making the best GPUs for it.

Pretty sure Ke0 means AMD DDR memory.
 

Aroll

Member
I'm not gonna pretend Nvidia's a nice company looking out for their customers, but neither is AMD, and the infrequency of their driver updates is a perfect example of that. Neither company is going to treat you nicely, because they both share one common and sole interest: your cash. Between Nvidia and AMD it's hard to pick a lesser of two evils, since they're both shit at different things.

AMD has pretty decent driver updates, and when they don't, they offer you the ability to customize things natively yourself. So it's STILL a more open platform.

Take this new drama - without source code, you can never fully optimize the drivers. Did Nvidia do it for Tomb Raider? Sure, after a bit of time, but they could do it more easily because the source code was there and could be changed. AMD - a lot of the time their slowness is DUE to NVidia. They have to work around them.

That being said, NVidia has consistently been at the forefront of making the best GPUs on the market for a while. Can't fault their popularity when they deliver on the #1 thing that matters most. At least, until HBM comes out. Assuming it's more affordable than a Titan and it happens to somehow outperform it, AMD may have that edge back again.

I use both. I love the 970m in my laptop - I trust it over AMD's top option and I get better FPS in general on close-to-max settings. But desktop-wise, it's been a pretty big wash between the 980/290x for me, and those are the most affordable "gamer" cards above the low-to-mid range.
 

Sinistral

Member
And if AMD were a decently run company, this is the perfect area to dump millions into (not fucking desktop memory). All they need to do is provide a solution that is just as good, that is (this being very important) as well documented and easy to implement as the competition's, and give developers as much engineering expertise as the competition would.

When developers have no alternatives, you can't fault Nvidia for seeing an opportunity when it arises.

It amazes me how people say Nvidia is the company with shitty business sense and practices when they're the company dominating every GPU-based field they enter. They dominate because AMD almost literally lets them dominate these fields.

I mean, if you ran a GPU company and saw that your competitors were announcing they were going to enter the ARM sector, would you follow, or would you make your own announcement about desktop memory modules? A field which has razor-thin profit margins as it is?

Your choice of arguments is really funny and you're clearly stabbing in the dark. I won't deny AMD's poor management. Tegra is far and away a failure of a product. Great tech, but absolutely no market penetration. But maybe nVidia can make it work in the automobile sector, or when they force it onto their GPUs.

The AMD-branded RAM is laughable, I agree. AMD does have ventures into ARM - look into the upcoming K12 and the earlier A1100 products. They spent millions on ARM, but it was a dead end and they are scaling back their efforts to focus on Zen.

Meanwhile, AMD also invested millions in Mantle, the baseline for DX12, Vulkan and Metal. They created GDDR3 and GDDR5 and are now coming out with HBM. All revolutionary steps in memory - that NVidia will benefit from as well. AMD's biggest failing is not capitalizing on their strengths in a public-facing way. And that's entirely on them.
 

Ke0

Member
They did for Hairworks. It's called TressFX and it was around first, is easily employed on any GPU, is well documented, and has its code out there, able to be altered via drivers.

And they've done absolutely fuck all to market TressFX. TressFX is out and about, but guess who has to do most of the grunt work to make it play nice with existing toolchains? And it's not "well documented" - not nearly as well as Hairworks is. The day Hairworks dropped, people all over forums had samples up and running, and Epic was able to integrate much of it into their UE without much fuss. There is a huge difference between "documented" and "well documented". AMD's entire open source model is predicated on the idea that the community will do most of the grunt work. Yea, good luck with that in the face of proprietary software that is not only easier to implement but much better documented, with sample code galore. You want to integrate Gameworks in your mobile game? Here's a shitload of documentation and sample code on how to do it for both Android and iOS.

This is pretty much what AMD has always done. As for "not desktop memory" - why not? HBM is the next great step after GDDR5. GDDR5 is currently used in mobile devices and laptops. Theoretically, HBM is going to eventually be used in said devices as well. Is it starting on the desktop side? Of course, but all the tech that goes into that stuff starts there before it's shrunk even further and optimized even more for smaller devices.

Neither HBM nor GDDR5 are "desktop modules". I'm clearly talking about this absolutely nonsensical idea: http://www.amd.com/en-us/products/memory/overview#
[Image: AMD Radeon R9 Gamer Series memory]


You know what the sales metrics are for their desktop memory modules? Shit, absolute shit. You know whose fault that's not? Nvidia's. You know what Nvidia was looking into when murmurings about AMD getting into memory started becoming a thing?

[Image: Nvidia Tegra badge]


Guess which one led to a better financial situation. This was an area AMD could have competed in, because they had just as much expertise. Yet AMD ignored ARM and mobile for god knows how long because? They let Nvidia make millions/billions off Tegra as they sat there doing nothing. They didn't even try, they just looked at Nvidia and basically shrugged, saying "well...there's that!"

Nvidia used the fact that they had a sizeable market with Tegra to introduce software solutions for mobile devices that developers could use (for both their own Tegra chips and for other devices...much like the desktop space). Later, Nvidia rolled those years of documentation and sample code together with their OptiX solution - created off the back of courting the HPC market (yet another area AMD let Nvidia have) - and much of their PhysX documentation, sample code and expertise to create Gameworks.

You guys wonder why developers choose Gameworks? Because Nvidia put in the time to make Gameworks appealing. You want to know why developers use Gameworks? Because Nvidia put together years of varying implementations, documentation, and sessions with companies to make it happen. Gameworks didn't appear out of thin air. This was years in the making, and all of us were privy to the pieces that led to the creation of Gameworks. You can trace every part of Gameworks back to Nvidia making hardware inroads into different sectors. Sectors that AMD chose to ignore, and somehow this is apparently Nvidia's fault.

Nvidia is currently getting into the automated car industry, and they're providing resources to these companies. I can guarantee that in years' time this too will trickle down to gaming, much like everything else Nvidia does.

The question really isn't whether they should be focusing on HBM on the desktop, the question is - can they stay ahead on the tech? They don't make that sort of thing exclusive - which means that yes, NVidia cards are going to be HBM-based eventually if it proves to really be that much better. So, once that happens, does AMD keep their edge on the tech, or do they lose it? Remember, AMD invented the GDDR tech, but within a year they fell behind on making the best GPUs for it.

And Nvidia invented fully programmable shaders. Compete.
 

Belmire

Member
Is the lack of an AMD Crossfire profile for Witcher 3 also because of Nvidia's Gameworks? 295x2 users must not be very happy.
 

Crisium

Member
So AMD can't keep up? I'm not seeing how this makes Nvidia bad.

As I said before, this goes against everything PC gaming has stood for for decades. GPU hardware makers should never be allowed to alter or modify any game code unless it's open source. AMD and NV are hardware providers, not game developers. What if Intel spent billions of dollars and injected proprietary code into all AAA games for Intel GPUs? They could bury Nvidia.

Sounds like your definition of fair market competition involves whichever company has the most money winning.
 

Sinistral

Member
[Image: Nvidia Tegra badge]


Guess which one led to a better financial situation. This was an area AMD could have competed in, because they had just as much expertise. Yet AMD ignored ARM and mobile for god knows how long because? They let Nvidia make millions/billions off Tegra as they sat there doing nothing. They didn't even try, they just looked at Nvidia and basically shrugged, saying "well...there's that!"

Which are down in sales:
http://www.cnet.com/au/news/nvidia-4q-beats-expectations-on-strong-graphic-chip-sales/

Sales of graphics chips rose 13 percent, to $1.07 billion, thanks to strong demand for the GeForce chip line for gaming PCs, while Tegra sales dropped 15 percent, to $112 million.

Billions off Tegra... haha sure.
 

Soltype

Member
That's because I never said NVidia was bad; read my previous posts in this thread.
I apologize, my mistake.

As I said before, this goes against everything PC gaming has stood for for decades. GPU hardware makers should never be allowed to alter or modify any game code unless it's open source. AMD and NV are hardware providers, not game developers. What if Intel spent billions of dollars and injected proprietary code into all AAA games for Intel GPUs? They could bury Nvidia.

Sounds like your definition of fair market competition involves whichever company has the most money winning.
I thought all the Nvidia bonus features were optional. If not, then you might have a point.
 

Crisium

Member
I thought all the Nvidia bonus features were optional. If not, then you might have a point.

What happens if Nvidia grows 10-100x in size over the next 20-30 years? Should the next generation of young PC gamers face a situation where Nvidia's SDK is used to create the entire physics model for a AAA game via PhysX, the entire lighting/rain/shadow model via the GW SDK, the entire tessellation model for the city? If we go down this path, the bigger and more financially strong the hardware developer grows, the more this firm can alter/shape PC game development; in turn, this firm slowly becomes a software developer that forces you to buy their hardware to play the game.

Things like SM3.0, tessellation, and HBAO should all be open source. Imagine a world where, starting from the GeForce 2, every single next-generation feature from compute shaders to global illumination to SM3.0 to tessellation was inserted into AAA games as part of Nvidia's proprietary GW SDK. PC gaming would not be what it is today.

Even just a few years ago, Nvidia was not this bad. Nvidia worked with CryTek to push SM3.0, and they pushed tessellation in the Batman and HAWX games. But all of those next-gen features were completely open. Now they have decisive market leverage and are using it to snuff out the competition by taking deeper steps into proprietary software. This is a dangerous trend, period.
 

Belmire

Member
I thought all the Nvidia bonus features were optional. If not, then you might have a point.

Seems to be that way with Witcher 3 - Hairworks can be turned off. Pcars, on the other hand, has PhysX in its engine, which cannot be turned off but is not GPU-accelerated; it's CPU-accelerated, even on Nvidia. AMD users are furious and are blaming PhysX for the horrible performance. AMD has not released a driver for either of those two games. If AMD were to release a driver for Pcars and we saw major improvements, I would take that as confirmation that PhysX is CPU-accelerated, as it would be vendor-neutral.
 

Soltype

Member
What happens if Nvidia grows 10-100x in size over the next 20-30 years? Should the next generation of young PC gamers face a situation where Nvidia's SDK is used to create the entire physics model for a AAA game via PhysX, the entire lighting/rain/shadow model via the GW SDK, the entire tessellation model for the city? If we go down this path, the bigger and more financially strong the hardware developer grows, the more this firm can alter/shape PC game development; in turn, this firm slowly becomes a software developer that forces you to buy their hardware to play the game.

Things like SM3.0, tessellation, and HBAO should all be open source. Imagine a world where, starting from the GeForce 2, every single next-generation feature from compute shaders to global illumination to SM3.0 to tessellation was inserted into AAA games as part of Nvidia's proprietary GW SDK. PC gaming would not be what it is today.

Even just a few years ago, Nvidia was not this bad. Nvidia worked with CryTek to push SM3.0, and they pushed tessellation in the Batman and HAWX games. But all of those next-gen features were completely open. Now they have decisive market leverage and are using it to snuff out the competition by taking deeper steps into proprietary software. This is a dangerous trend, period.

Are you suggesting that Nvidia spend money on developing new technology and then release it for free? I see what you're saying about the Nvidia future, but what's the alternative? They have been very good about giving me reasons to buy their products; I can't say the same about AMD.
 
Your choice of arguments is really funny and you're clearly stabbing in the dark. I won't deny AMD's poor management. Tegra is far and away a failure of a product. Great tech, but absolutely no market penetration. But maybe nVidia can make it work in the automobile sector, or when they force it onto their GPUs.

The AMD-branded RAM is laughable, I agree. AMD does have ventures into ARM - look into the upcoming K12 and the earlier A1100 products. They spent millions on ARM, but it was a dead end and they are scaling back their efforts to focus on Zen.

Meanwhile, AMD also invested millions in Mantle, the baseline for DX12, Vulkan and Metal. They created GDDR3 and GDDR5 and are now coming out with HBM. All revolutionary steps in memory - that NVidia will benefit from as well. AMD's biggest failing is not capitalizing on their strengths in a public-facing way. And that's entirely on them.

Slightly OT, but Tegra in the car industry has been/will be adopted by the VAG group.

Most new Audis will have it implemented; the new TT is just the beginning, with it then spreading out to other partners/models.
 

dex3108

Member
Seems to be that way with Witcher 3 - Hairworks can be turned off. Pcars, on the other hand, has PhysX in its engine, which cannot be turned off but is not GPU-accelerated; it's CPU-accelerated, even on Nvidia. AMD users are furious and are blaming PhysX for the horrible performance. AMD has not released a driver for either of those two games. If AMD were to release a driver for Pcars and we saw major improvements, I would take that as confirmation that PhysX is CPU-accelerated, as it would be vendor-neutral.

PhysX is part of the Unity engine and Unreal Engine 4, and yet I don't see users or AMD complaining. And think of how many games are out there on those engines - we don't see many complaints at all.
 

Ke0

Member
Your choice of arguments is really funny and you're clearly stabbing in the dark. I won't deny AMD's poor management. Tegra is far and away a failure of a product. Great tech, but absolutely no market penetration. But maybe nVidia can make it work in the automobile sector, or when they force it onto their GPUs.

You're arguing nonsense. Tegra has brought Nvidia quite a bit of profit even as a failure of a product, and it gave them huge inroads into mobile development.

The AMD-branded RAM is laughable, I agree. AMD does have ventures into ARM - look into the upcoming K12 and the earlier A1100 products. They spent millions on ARM, but it was a dead end and they are scaling back their efforts to focus on Zen.

Yea, AMD got into ARM via the Opteron A series how many damn years later? Exactly my point. Go on Hardforums and read a few people whose companies got into AMD's early access for the Opteron A, and most of them will say the same thing: they're slow and just can't compete. They have to create software to utilize random proprietary coprocessors just to bring performance up to par with (and many times below) Intel's offerings.

Meanwhile, AMD also invested millions in Mantle, the baseline for DX12, Vulkan and Metal. They created GDDR3 and GDDR5 and are now coming out with HBM. All revolutionary steps in memory - that NVidia will benefit from as well. AMD's biggest failing is not capitalizing on their strengths in a public-facing way. And that's entirely on them.

Which is what all of us non-frenzied people have been saying: you people need to place blame where it belongs, and that's at AMD's feet. It's not Nvidia's fault that AMD's business acumen is poor.

Just because AMD likes to run like a non-profit and makes no attempt to capitalize on its expenditures doesn't mean that Nvidia has to follow suit. I mean, AMD helped JEDEC create the GDDR3 spec, and it took them how long to bring out a card that used it, compared to Nvidia?
 

Ke0

Member

I'm sorry, did Tegra just now come into existence in Q2 2015? Pretty sure Tegra came to fruition in 2007/2008, and I'm fairly positive that from 2007 to 2015, yes, Tegra definitely brought in quite a nice bit of profit. Whether that be millions or billions, you're still splitting hairs and deflecting from the truth of the matter. It's still a sector that helped Nvidia's bottom line that AMD completely ignored, and it was another sector where Nvidia could push their software solutions. And it contradicts the narrative that Tegra has been a financial bust for Nvidia, when clearly it hasn't.

This was a sector that AMD could have easily gotten into and competed with Nvidia in, and they had ample time to do so, but they didn't. And that blame is on AMD; no one can damn Nvidia for doing what AMD didn't.

If AMD were to enter that sector tomorrow, Nvidia would be under no obligation to play nice and open source their solutions.

Nvidia actually got most of the big boys. I wonder what Mercedes and Ford are using?!

Snapdragons, Qualcomm parts, etc.
 

Sinistral

Member
You're arguing nonsense. Tegra has brought Nvidia quite a bit of profit even as a failure of a product, and it gave them huge inroads into mobile development.

Lol, show me the receipts. Sales last quarter were $112 million, a downward trend. That's gross, not net. This new car stuff is GREAT! Looks like it may start trending upward. But how long will that last? Nvidia had the last generation of consoles, even the original Xbox. Hopefully they've learned.

Yea, AMD got into ARM via the Opteron A series how many damn years later? Exactly my point. Go on Hardforums and read a few people whose companies got into AMD's early access for the Opteron A, and most of them will say the same thing: they're slow and just can't compete. They have to create software to utilize random proprietary coprocessors just to bring performance up to par with (and many times below) Intel's offerings.

That's not what you said:

I mean, if you ran a GPU company and saw that your competitors were announcing they were going to enter the ARM sector, would you follow, or would you make your own announcement about desktop memory modules

You were saying they never even glanced at ARM; now you're backtracking. And AMD-branded RAM came WAY later than any ARM initiatives. Not to mention the investments on these two fronts are of widely different scales.

Which is what all of us non-frenzied people have been saying: you people need to place blame where it belongs, and that's at AMD's feet. It's not Nvidia's fault that AMD's business acumen is poor.

You are coming off as frenzied, and your arguments are juvenile. I am blaming AMD. That's what I've been saying all along... But I blame them for other things, just as much as I blame NVidia and the game companies for the current situation.

Just because AMD likes to run like a non-profit and makes no attempt to capitalize on its expenditures doesn't mean that Nvidia has to follow suit. I mean, AMD helped JEDEC create the GDDR3 spec, and it took them how long to bring out a card that used it, compared to Nvidia?

And when GDDR5 was released, their memory controllers were far ahead of nVidia's. They let that slip away, of course; that's really where the cards started to fall. I hope this is a repeat for HBM, without the slipping. AMD will have a generation's head start on HBM implementation. HBM2 is largely the same, and just expands on capacity.

I'm not calling for AMD to fight fire with fire. That would be to the detriment of PC gaming, and I'd be the first to leave PC gaming if there were only one vendor to buy from. But their war of Open vs. Blackbox is a good one that they should fight, and AMD needs to make a stronger effort to convince the masses that their way is better. It's what NVidia is currently doing. Successfully, and rightfully so.

A poorly recorded and edited video is poor showmanship.
 

Sinistral

Member
I'm sorry, did Tegra just now come into existence in Q2 2015? Pretty sure Tegra came to fruition in 2007/2008, and I'm fairly positive that from 2007 to 2015, yes, Tegra definitely brought in quite a nice bit of profit. Whether that be millions or billions, you're still splitting hairs and deflecting from the truth of the matter. It's still a sector that helped Nvidia's bottom line that AMD completely ignored, and it was another sector where Nvidia could push their software solutions. And it contradicts the narrative that Tegra has been a financial bust for Nvidia, when clearly it hasn't.

This was a sector that AMD could have easily gotten into and competed with Nvidia in, and they had ample time to do so, but they didn't. And that blame is on AMD; no one can damn Nvidia for doing what AMD didn't.

If AMD were to enter that sector tomorrow, Nvidia would be under no obligation to play nice and open source their solutions.

Snapdragons, Qualcomm parts, etc.

Except ARM is already an open architecture. They have no choice but to play on even ground. Unlike Gameworks.

Do you understand the difference between gross and net income? Please show me where NVidia is pulling a profit from Tegra. All I can find are sales figures, which are usually indicative of gross.

2014 was Tegra's biggest year. If the car market is finally able to turn Tegra into a viable long-term product, great. But don't deny that before that, NVidia was not even a blip in the ARM market.

[Image: Qualcomm chart]


Yup, AMD is even worse here, which is why they are wisely scaling back their efforts, supplying MediaTek with graphics and focusing on Zen.
 

OmegaDL50

Member
AMD has pretty decent driver updates, and when they don't, they offer you the ability to customize things natively yourself. So, it's STILL a more open platform.

As the owner of an HD7950, I find this highly questionable at best, dubious at worst.

We are talking about the same AMD that won't let me use the native driver-based supersampling feature called VSR because I'm still using a 7000 series graphics card.

Yet the R9 280 is a rebrand of the HD7950. It is the exact same card; the only difference is the name. And AMD had the gall to claim that the R9 series has a hardware scaler that made VSR possible only on the R9 series.

Turns out this so-called hardware scaler was just bullshit PR speak, because with a registry edit and a custom driver, VSR worked on the 7000 series cards just like it did on the R9. Imagine that. What it comes down to is a "business decision": AMD cut off features that FULLY WORK on a revision of graphics cards, intentionally, just to make the R9 series look more appealing to upgrade to.

I would definitely say that it does not make it a more open platform. If anything, it says the opposite.

http://forums.guru3d.com/showthread.php?t=397875
 

Rur0ni

Member
And they've done absolutely fuck all to market TressFX. TressFX is out and about, but guess who has to do most of the grunt work to make it play nice with existing toolchains? And it's not "well documented" - not nearly as well as Hairworks is. The day Hairworks dropped, people all over forums had samples up and running, and Epic was able to integrate much of it into their UE without much fuss. There is a huge difference between "documented" and "well documented". AMD's entire open source model is predicated on the idea that the community will do most of the grunt work. Yea, good luck with that in the face of proprietary software that is not only easier to implement but much better documented, with sample code galore. You want to integrate Gameworks in your mobile game? Here's a shitload of documentation and sample code on how to do it for both Android and iOS.



Neither HBM nor GDDR5 are "desktop modules". I'm clearly talking about this absolutely nonsensical idea: http://www.amd.com/en-us/products/memory/overview#
[Image: AMD Radeon R9 Gamer Series memory]


You know what the sales metrics are for their desktop memory modules? Shit, absolute shit. You know whose fault that's not? Nvidia's. You know what Nvidia was looking into when murmurings about AMD getting into memory started becoming a thing?

[Image: Nvidia Tegra badge]


Guess which one led to a better financial situation. This was an area AMD could have competed in, because they had just as much expertise. Yet AMD ignored ARM and mobile for god knows how long because? They let Nvidia make millions/billions off Tegra as they sat there doing nothing. They didn't even try, they just looked at Nvidia and basically shrugged, saying "well...there's that!"

Nvidia used the fact that they had a sizeable market with Tegra to introduce software solutions for mobile devices that developers could use (for both their own Tegra chips and for other devices...much like the desktop space). Later, Nvidia rolled those years of documentation and sample code together with their OptiX solution - created off the back of courting the HPC market (yet another area AMD let Nvidia have) - and much of their PhysX documentation, sample code and expertise to create Gameworks.

You guys wonder why developers choose Gameworks? Because Nvidia put in the time to make Gameworks appealing. You want to know why developers use Gameworks? Because Nvidia put together years of varying implementations, documentation, and sessions with companies to make it happen. Gameworks didn't appear out of thin air. This was years in the making, and all of us were privy to the pieces that led to the creation of Gameworks. You can trace every part of Gameworks back to Nvidia making hardware inroads into different sectors. Sectors that AMD chose to ignore, and somehow this is apparently Nvidia's fault.

Nvidia is currently getting into the automated car industry, and they're providing resources to these companies. I can guarantee that in years' time this too will trickle down to gaming, much like everything else Nvidia does.



And Nvidia invented fully programmable shaders. Compete.
Good post. I like an underdog story as much as anyone, but nVidia executes. This goes beyond gaming.
 
Holy fuck, they approached CDPR about adding TressFX two months before the game went gold?!

And from this article: http://arstechnica.co.uk/gaming/201...s-completely-sabotaged-witcher-3-performance/

"Most of the time we optimize games based on binary builds, not source code... I believe it is a resource issue. Nvidia spent a lot of artist and engineering resources to help make Witcher 3 better. I would assume that AMD could have done the same thing because our agreements with developers don't prevent them from working with other IHVs [independent hardware vendors]."

So not only did they not approach CDPR about adding TressFX during the game's years of development, but they also aren't able to optimize from binaries like Nvidia can? Why do they keep blaming it on source code if Nvidia doesn't even use it for their optimizations?

And how come AMD/ATI's tessellation performance has been so shitty for so many years?
 

gogogow

Member
2014 was Tegra's biggest year. If the car market is finally able to turn Tegra into a viable long-term product, great. But don't deny that before that, NVidia was not even a blip in the ARM market.

I don't think that matters at all. If Nvidia hadn't gone into that market early, they wouldn't have had chips to sell to the car manufacturers, and now they have contracts with the majority of the big players. All in all, it was a good move to enter that market. They have very powerful and mature SoCs because they kept R&D-ing their Tegra chips.

If AMD's DDR memory were to sell millions of units next year, a lot of people would have to eat crow too, but I kinda doubt that will happen.
 

Sinistral

Member
Holy fuck, they approached CDPR about adding TressFX two months before the game went gold?!

And from this article: http://arstechnica.co.uk/gaming/201...s-completely-sabotaged-witcher-3-performance/



So not only did they not approach CDPR about adding TressFX during the game's years of development, but they also aren't able to optimize from binaries like Nvidia can? Why do they keep blaming it on source code if Nvidia doesn't even use it for their optimizations?

And how come AMD/ATI's tessellation performance has been so shitty for so many years?

That's because Hairworks wasn't implemented into the main game branch until two months before release. When it was, performance was terrible. AMD offered TressFX, but CDProjektRED said it was too late.

Go back two years or so, and CDProjektRED was showing use cases with wolves and Hairworks, and the game was a confirmed Gameworks title. Their contract was signed. Begin the overpromising, lying, and failing to become the bastion of PC gaming CDProjekt was heralded to be.
 

Sinistral

Member
I don't think that matters at all. If Nvidia hadn't gone into that market early, they wouldn't have had chips to sell to the car manufacturers, and now they have contracts with the majority of the big players. All in all, it was a good move to enter that market. They have very powerful and mature SoCs because they kept R&D-ing their Tegra chips.

If AMD's DDR memory were to sell millions of units next year, a lot of people would have to eat crow too, but I kinda doubt that will happen.

I don't deny that; I initially said Tegra was nice tech but had no market penetration. But the Tegra example was brought up as a direct demonstration that AMD does not have any good products, with AMD-branded DDR3 RAM as the proof. Which is nowhere near the same scope or scale as Tegra, or of the same consequence to any industry.
 

Ke0

Member
Except ARM is already an open architecture. They have no choice but to play on even ground. Unlike Gameworks.

Which means that AMD had no excuse for not undercutting Nvidia and making inroads of their own.

Do you understand the difference between gross and net income? Please show me where NVidia is pulling a profit from Tegra. All I can find are sales figures, which are usually indicative of gross.

I do. Now show me the operating expenditures for Tegra; we know the gross. You're basically arguing that Nvidia has never made a profit on Tegra based on absolutely nothing. If Tegra were bleeding Nvidia, it would have shown in their weaker quarters. But to let Nvidia themselves tell it, Q2 of 2012 saw profits off the back of Tegra. How much Tegra contributed to the $119m is anyone's guess, but I doubt Nvidia would be telling stockholders that if Tegra was bleeding them. I'd wager Nvidia makes more off Tegra than AMD ever has off that desktop RAM and PCI-E storage. I mean, what's the long-term strategy with those two things exactly?

Regardless, you're arguing just to argue. The point is, we can literally trace how Gameworks came to fruition, and how it became used in many different engines, through the sectors Nvidia went after. Everything in gaming trickled down from these other sectors. There's nothing questionable about how Nvidia got to where they are; their competition let them.

2014 was Tegra's biggest year. If the car market is finally able to turn Tegra into a viable long-term product, great. But don't deny that before that, NVidia was not even a blip in the ARM market.

You don't need to dominate the ARM market to find success in it. Nvidia found a niche that works for them.

I don't think that matters at all. If Nvidia hadn't gone into that market early, they wouldn't have had chips to sell to the car manufacturers, and now they have contracts with the majority of the big players. All in all, it was a good move to enter that market. They have very powerful and mature SoCs because they kept R&D-ing their Tegra chips.

If AMD's DDR memory were to sell millions of units next year, a lot of people would have to eat crow too, but I kinda doubt that will happen.

AMD's DDR memory has been on the market for what, one or two years now? It doesn't sell because there are better alternatives. Desktop memory is a ridiculously crowded market with razor-thin profit margins.

AMD's memory doesn't benchmark any faster than any other brand, and there is no bonus performance from dropping it into AMD systems.

That's because Hairworks wasn't implemented into the main game branch until two months before release. When it was, performance was terrible. AMD offered TressFX, but CDProjektRED said it was too late.

Go back two years or so, and CDProjektRED was showing use cases with wolves and Hairworks, and the game was a confirmed Gameworks title. Their contract was signed. Begin the overpromising, lying, and failing to become the bastion of PC gaming CDProjekt was heralded to be.

Got a source on this?
 

Sinistral

Member
I do. Now show me the operating expenditures for Tegra; we know the gross. You're basically arguing that Nvidia has never made a profit on Tegra based on absolutely nothing. If Tegra were bleeding Nvidia, it would have shown in their weaker quarters. But to let Nvidia themselves tell it, Q2 of 2012 saw profits off the back of Tegra. How much Tegra contributed to the $119m is anyone's guess, but I doubt Nvidia would be telling stockholders that if Tegra was bleeding them. I'd wager Nvidia makes more off Tegra than AMD ever has off that desktop RAM and PCI-E storage. I mean, what's the long-term strategy with those two things exactly?

Regardless, you're arguing just to argue. The point is, we can literally trace how Gameworks came to fruition, and how it became used in many different engines, through the sectors Nvidia went after. Everything in gaming trickled down from these other sectors. There's nothing questionable about how Nvidia got to where they are; their competition let them.



You don't need to dominate the entire ARM market. Nvidia found a niche that works for them.


So you can't show me the receipts, and you're calling on me to argue with myself. I doubt Tegra is actually bringing in a profit. I can go on believing this as much as you continue to believe what you want about AMD.

I've argued and shown proof that it is not a significant portion of nVidia's sales. It is not making the billions of profit you've claimed; it has no chance to with how little in sales it brings in per quarter. I don't know how much they're spending on R&D, personnel, marketing, etc. for it; I can't find sources. You're asking me to argue with myself now. This whole argument is ridiculous. All I have to go on for Tegra's profit is your word. Let it end. I'm done. Internet arguing, lol.

We'll have to see if it does work for NVidia. The cars are just starting. You remember nForce motherboards? How long did that tenure last? Where is that business venture now? How about the consoles?

Got a source on this?

Why are you even in this thread if you don't want to take in the information in the OP?
 

NeOak

Member
Which means that AMD had no excuse for not undercutting Nvidia and making inroads of their own.



I do. Now show me the operating expenditures for Tegra; we know the gross. You're basically arguing that Nvidia has never made a profit on Tegra based on absolutely nothing. If Tegra were bleeding Nvidia, it would have shown in their weaker quarters. But to let Nvidia themselves tell it, Q2 of 2012 saw profits off the back of Tegra. How much Tegra contributed to the $119m is anyone's guess, but I doubt Nvidia would be telling stockholders that if Tegra was bleeding them. I'd wager Nvidia makes more off Tegra than AMD ever has off that desktop RAM and PCI-E storage. I mean, what's the long-term strategy with those two things exactly?
Ke0, please stop touting Tegra as a success. It was a fucking failure. NVidia couldn't deliver the thermals it promised at the performance it promised. That is SUICIDE in the embedded world, and there are NO second chances. None.

You forget that Tegra 2 was a turd because it didn't have NEON? Tegra 3 was shit and had to be actively cooled (lawl, fans on cellphones, right?) to get decent clocks. Let's not forget the god-awful eMMC controller that kills performance on the 2012 Nexus 7. Tegra 4? lol. Tegra 4i? Blackphone took the chips basically for free because no one else wanted them.

Regardless you're arguing just to argue. The point is, we can literally trace how Gameworks came to fruition and how it became used in many different engines by the sectors Nvidia went after. Everything from gaming is trickled down from these other sectors. There's nothing questionable about how Nvidia got to where they are, their competition let them.



You don't need to dominate the ARM market to find success in it. Nvidia found a niche that works for them.

Tegra is a failure in terms of market and money. It is dead for OEMs and ODMs. Why do you think they are suing other companies over "graphics"-related stuff after no one licensed their shit because it was pure smoke?


AMD's DDR memory has been on the market for what, one or two years now? It doesn't sell because there are better alternatives. Desktop memory is a ridiculously crowded market with razor-thin profit margins.

AMD's memory doesn't benchmark any faster than any other brand, and there is no bonus performance from dropping it into AMD systems.

As for the RAM, I said it before and I'll repeat it: it was marketing, and it is marketing. You say that as if they had to do ANYTHING with it besides put an "AMD Radeon" sticker on the DIMMs and boxes.



Got a source on this?

READ THE FUCKING OP
 
That's because Hairworks wasn't implemented into the main game branch until two months before release. When it was, performance was terrible. AMD offered TressFX, but CDProjektRED said it was too late.

Go back two years or so, and CDProjektRED was showing use cases with wolves and Hairworks, and the game was a confirmed Gameworks title. Their contract was signed. Begin the overpromising, lying, and failing to become the bastion of PC gaming CDProjekt was heralded to be.

Of course it was too late! Why wouldn't AMD push TressFX at the same time Hairworks was announced? What was going on there? Why did they wait until the very end to say "Oh shit, we don't know how to optimize our drivers for Nvidia's hair technology. Fuck. Wait, we have hair technology too.

Hey CDPR, I know you guys are just patching bugs and shit, but we don't know how to optimize stuff from a binary. How would you like to implement a TOTALLY DIFFERENT hair technology in two weeks?"

What the fucking shit?! The Forbes author rags on them for this, and he's the first writer I've seen say WTF about that. That's the most WTF developer relations thing I've seen recently.
 

Sinistral

Member
Of course it was too late! Why wouldn't AMD push TressFX at the same time Hairworks was announced? What was going on there? Why did they wait until the very end to say "Oh shit, we don't know how to optimize our drivers for Nvidia's hair technology. Fuck. Wait, we have hair technology too.

Hey CDPR, I know you guys are just patching bugs and shit, but we don't know how to optimize stuff from a binary. How would you like to implement a TOTALLY DIFFERENT hair technology in two weeks?"

What the fucking shit?! The Forbes author rags on them for this, and he's the first writer I've seen say WTF about that. That's the most WTF developer relations thing I've seen recently.

Contracts, binary blobs, NDAs. The bastion of PC gaming, a dev house that struggles to live up to the visual promises of a game revealed long ago that was supposed to be PC-centric, turns out consolized. A fat wad of bills is what happened.

Why wasn't TressFX pushed after the use of Hairworks was announced? Because it was already fucking announced. They'd been goaded. Nvidia, with its superior market share and marketing pressure, paved the way. AMD could try, but if there's no incentive, then there's no reason. It's both AMD's fault for not being able to offer incentives and CDProjektRED's for being in need of them.

GTA5 for PC: technologies from both camps, and equal, expected performance on everyone's cards. Rockstar doesn't need nVidia's pittance. CDProjektRED probably did, and were reliant on it, as Witcher 1 and 2 were already Gameworks titles.

Forbes is shit. It's nothing but trash. Look at their previous article on the issue.
 
Contracts, binary blobs, NDAs. The bastion of PC gaming, a dev house that struggles to live up to the visual promises of a game revealed long ago that was supposed to be PC-centric, turns out consolized. A fat wad of bills is what happened.

Why wasn't TressFX pushed after the use of Hairworks was announced? Because it was already fucking announced. They'd been goaded. Nvidia, with its superior market share and marketing pressure, paved the way. AMD could try, but if there's no incentive, then there's no reason. It's both AMD's fault for not being able to offer incentives and CDProjektRED's for being in need of them.

GTA5 for PC: technologies from both camps, and equal, expected performance on everyone's cards. Rockstar doesn't need nVidia's pittance. CDProjektRED probably did, and were reliant on it, as Witcher 1 and 2 were already Gameworks titles.

Forbes is shit. It's nothing but trash. Look at their previous article on the issue.

Yes, one of the biggest problems in these situations is how eagerly game developers sell out a portion of their paying customers in exchange for "development assistance" from IHVs. Rockstar obviously has no need to, or interest in, getting themselves, or their customers, screwed by bad contracts from either nVidia or AMD.
 
Yes, one of the biggest problems in these situations is how eagerly game developers sell out a portion of their paying customers in exchange for "development assistance" from IHVs. Rockstar obviously has no need to, or interest in, getting themselves, or their customers, screwed by bad contracts from either nVidia or AMD.

Either something is going to look like shit, or something is going to look better because IHVs reached out to developers. Then the proprietary stuff all gets rolled into some version of DirectX a couple of years later and becomes standard kit that everyone uses.

Welcome to the last two decades of 3D on PCs.
 

catbrush

Member
Nvidia is playing a zero-sum game with the PC market. It is the very definition of anti-competitive practice.

If you support this, or think AMD should do the same to "compete", you're a fanboy.

If it makes you happy to know other people are having a bad experience, you are not a nice person.

If you think it's worth spending an extra $300 for unoptimized hair effects, question your financial investment decisions.

On the flip side, Square Enix integrated parts of Gameworks into FFXIV, and judging by the new benchmark it seems to run great on AMD and Nvidia, so I blame the developers more than anything.
 
Either something is going to look like shit, or something is going to look better because IHVs reached out to developers. Then the proprietary stuff all gets rolled into some version of DirectX a couple of years later and becomes standard kit that everyone uses.

Welcome to the last two decades of 3D on PCs.

Bullshit. The fact is these developers manage to ship multiple versions of the same game on consoles that include no code provided by nVidia, so the idea that the PC versions can't be competently made without IHV intervention is simply dishonest. Hair shaders are not secret, magic tech. They are drop-in fancifiers that these companies peddle like shiny baubles to ignorant tourists while the publishers and developers pocket a neat sum. CDPR and others could easily roll their own solutions if they really cared so much about making their games look better. They can read the same research papers coming out of Pixar and Weta and ILM as nVidia and AMD engineers do.
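(To put a finer point on "not magic tech": the core of these hair systems is textbook strand dynamics - Verlet integration plus distance constraints - straight out of the published papers. Here's a toy C++ sketch of that idea; purely illustrative, and obviously not TressFX's or Hairworks' actual code.)

Code:
// Toy hair-strand simulation: Verlet integration with distance constraints.
// This is the textbook core behind TressFX/Hairworks-style systems; real
// implementations add collision, tessellation, and shading on the GPU.
#include <cmath>
#include <cstddef>
#include <vector>

struct Vec3 { float x, y, z; };

static Vec3 add(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec3 mul(Vec3 a, float s) { return {a.x * s, a.y * s, a.z * s}; }
static float len(Vec3 a) { return std::sqrt(a.x * a.x + a.y * a.y + a.z * a.z); }

struct Strand {
    std::vector<Vec3> pos, prev;  // current and previous vertex positions
    float restLen;                // rest distance between adjacent vertices
};

void step(Strand& s, Vec3 gravity, float dt) {
    // Verlet integration: next = 2*pos - prev + accel*dt^2.
    // Vertex 0 is pinned to the scalp, so start at 1.
    for (std::size_t i = 1; i < s.pos.size(); ++i) {
        Vec3 next = add(sub(mul(s.pos[i], 2.0f), s.prev[i]), mul(gravity, dt * dt));
        s.prev[i] = s.pos[i];
        s.pos[i] = next;
    }
    // Constraint relaxation: pull each vertex back to its rest distance
    // from its parent so the strand doesn't stretch.
    for (int iter = 0; iter < 4; ++iter) {
        for (std::size_t i = 1; i < s.pos.size(); ++i) {
            Vec3 d = sub(s.pos[i], s.pos[i - 1]);
            float l = len(d);
            if (l > 0.0f)
                s.pos[i] = add(s.pos[i - 1], mul(d, s.restLen / l));
        }
    }
}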
 

ZiggyST

Banned
But the problem with Witcher 3 isn't just with AMD cards, it's with Nvidia cards too. It runs like shit on Kepler based cards. A $200 card (960) is outperforming a $1000 one (6GB Titan). So it seems like Nvidia is purposely crippling their last gen cards to promote their current gen ones.

They learned that from Apple for sure ;)
 

Key2001

Member
Why wasn't TressFX pushed after the use of Hairworks was announced? Because it was already fucking announced. They'd been goaded. Nvidia, with its superior market share and marketing pressure, paved the way. AMD could try, but if there's no incentive, then there's no reason. It's both AMD's fault for not being able to offer incentives and CDProjektRED's for being in need of them.

I doubt it had anything to do with Nvidia's Hairworks announcement. We would have seen many more games with TressFX if AMD weren't so unwilling to approach developers about support, but somehow we are supposed to believe this was different with CDPR.

Would CDPR have been the exception, and would AMD have approached them about TressFX support if they weren't already using Hairworks? I somehow doubt it.

The performance hit is also not much of an excuse for waiting until two months from release. Wasn't it already known from the Far Cry 4 release that Hairworks has a big performance hit on AMD cards? It's been known even longer that Witcher 3 would use Hairworks. There's no excuse for AMD waiting until two months before release to approach CDPR about TressFX, just as there is none for them not approaching other devs out there.

It seems to me that waiting until two months from release, when they knew Hairworks had a big performance hit, before approaching CDPR has more to do with seeing an opportunity to show up Nvidia than it does with getting TressFX into the game or improving AMD card performance.
 
I'll be honest and admit I'm guessing here, but we've got small implications like this occurring with very specific games, and if consumers continue to defend these companies, this situation will get worse.
I fear that if this continues the same way, certain game features or graphics will in future be specific to only one type of card.

This sucks for all consumers regardless of brand, and everyone who owns a graphics card should be rejecting these kinds of bullshit practices; otherwise we are heading into similar territory as consoles.
 
Bullshit. The fact is these developers manage to ship multiple versions of the same game on consoles that include no code provided by nVidia, so the idea that the PC versions can't be competently made without IHV intervention is simply dishonest. Hair shaders are not secret, magic tech. They are drop-in fancifiers that these companies peddle like shiny baubles to ignorant tourists while the publishers and developers pocket a neat sum. CDPR and others could easily roll their own solutions if they really cared so much about making their games look better. They can read the same research papers coming out of Pixar and Weta and ILM as nVidia and AMD engineers do.

On consoles, the console makers are the IHVs, providing anywhere from non-existent to extensive support depending on who you are and what you're doing.

Every advancement you see in desktop CG today was driven by a proprietary extension to existing technology, then rolled into the standard once the dust settled and we figured out who won. Then the next version of DX/OGL would come out and the next round would start. Hell, we used to have cap bits that developers had to query to know if the hardware supported a specific feature.
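(For the youngsters, the cap-bit dance looked something like this under Direct3D 9 - a from-memory sketch, though GetDeviceCaps and D3DCAPS9 are the real API:)

Code:
// Query the driver's capability bits before relying on a feature.
#include <d3d9.h>

bool SupportsHwTnLAndPS20(IDirect3D9* d3d) {
    D3DCAPS9 caps = {};
    if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
        return false;
    // Hardware transform & lighting, and pixel shader model 2.0 or better.
    bool hwTnL = (caps.DevCaps & D3DDEVCAPS_HWTRANSFORMANDLIGHT) != 0;
    bool ps20  = caps.PixelShaderVersion >= D3DPS_VERSION(2, 0);
    return hwTnL && ps20;
}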
 

Aroll

Member
As the owner of an HD7950, I find this highly questionable at best, dubious at worst.

We are talking about the same AMD that won't let me use the native driver-based supersampling feature called VSR because I'm still using a 7000 series graphics card.

Yet the R9 280 is a rebrand of the HD7950. It is the exact same card; the only difference is the name. And AMD had the gall to claim that the R9 series has a hardware scaler that made VSR possible only on the R9 series.

Turns out this so-called hardware scaler was just bullshit PR speak, because with a registry edit and a custom driver, VSR worked on the 7000 series cards just like it did on the R9. Imagine that. What it comes down to is a "business decision": AMD cut off features that FULLY WORK on a revision of graphics cards, intentionally, just to make the R9 series look more appealing to upgrade to.

I would definitely say that it does not make it a more open platform. If anything, it says the opposite.

http://forums.guru3d.com/showthread.php?t=397875

By being open, I mean that even without driver updates, you can edit a lot of things and make them work. In fact, you just proved that point by stating you can get VSR working on 7000 series cards. AMD has a marketing problem. It was obvious they only said any of that to get folks to buy into the R9 series of cards. But it can be altered and used. There are some features with NVidia that you can't alter at all if they don't allow you to. But we're splitting hairs here.
 

Aroll

Member
Holy fuck, they approached CDPR about adding TressFX two months before the game went gold?!

And from this article: http://arstechnica.co.uk/gaming/201...s-completely-sabotaged-witcher-3-performance/



So not only did they not approach CDPR about adding TressFX during the game's years of development, but they also aren't able to optimize from binaries like Nvidia can? Why do they keep blaming it on source code if Nvidia doesn't even use it for their optimizations?

And how come AMD/ATI's tessellation performance has been so shitty for so many years?

Not sure anyone is taking away the blame AMD gets for not having TressFX in. Chances are, Hairworks and TressFX could both have been in there to do the same thing on each card (or even TressFX chosen over Hairworks, since it works so well on NVidia cards) if AMD had come to them even in the middle of last year. AMD has a PR problem, and a real problem in that it brings things up a bit later than NVidia, who get their word and tech in from closer to day one.

It's why NVidia crushes AMD in getting full feature sets so often. AMD has to learn from that, as it is a perfectly acceptable thing to do.

I think the problem really is - without the source code, how do you know what to alter? NVidia may not alter the source code for their optimizations, but obviously that source code is running, and the stuff they do alter is there to improve how that source code runs. So sure, the code itself may not get touched, and they adjust something else entirely, but that something else makes the source code run better. Without knowing exactly how that source code functions, since AMD can't look at it, they cannot make proper adjustments. Essentially, they have to guess: "Adjust this - does it run better now? No? How about this? No? Okay, now this?" etc.

They are literally shooting in the dark, hoping it suddenly runs better off guesswork. Having the source code isn't just about having the ability to edit that code to make it run better on your hardware; it's about understanding everything that code is doing, so you can adjust other areas of your drivers and tech to run it more efficiently. So sure, NVidia's fire-back seems sincere on the surface, but the reality is that while they may not need to touch that code, they know what it is doing. AMD does not. Guesswork leads to an inferior ability to adjust. TressFX, as poorly managed as it has been from a PR standpoint, is right there, so NVidia can see what it does, how it does it, and adjust.
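(For the curious, "optimizing from binaries" at the driver level conceptually looks something like this - a purely hypothetical C++ sketch, not anyone's actual driver code: fingerprint the game's compiled shader blobs and swap in hand-tuned replacements when one matches.)

Code:
// Conceptual sketch of driver-side shader replacement. A real driver hooks
// shader creation; here we just model the lookup itself.
#include <cstdint>
#include <unordered_map>
#include <vector>

using Bytecode = std::vector<uint8_t>;

// FNV-1a hash as a stand-in fingerprint for compiled shader blobs.
uint64_t fingerprint(const Bytecode& blob) {
    uint64_t h = 0xcbf29ce484222325ull;
    for (uint8_t b : blob) { h ^= b; h *= 0x100000001b3ull; }
    return h;
}

struct ShaderCache {
    // Hand-tuned replacement blobs shipped in the driver, keyed by hash.
    std::unordered_map<uint64_t, Bytecode> replacements;

    // On shader creation, silently substitute an optimized blob if known;
    // otherwise pass the game's original bytecode through untouched.
    const Bytecode& resolve(const Bytecode& incoming) const {
        auto it = replacements.find(fingerprint(incoming));
        return it != replacements.end() ? it->second : incoming;
    }
};

The catch the poster describes: without the HLSL source, building that replacement table really is trial and error.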

The reality is, we're seeing two different approaches to the market, and yes, AMD and Nvidia are both making mistakes. The problem most have with Nvidia is the way they are trending. They are slowly making things proprietary. This means you HAVE to get Nvidia hardware for certain things, and that's just the start - eventually, entire games could require their hardware. It's like only being able to watch videos on half the websites out there in HD on Project Spartan, and in SD in FF and everything else. Then, over time, suddenly those videos ONLY play on Project Spartan. Then the entire website only loads on it. Meanwhile, the other browsers' users are left out.

PC has always been about open source. NVidia is choosing to move away from that because they have a stranglehold and they know AMD is not marketing right. They are trying to force them out of the market.

Obviously, AMD is successful enough not to be in an out-of-the-market situation, but people are going to prefer open source over proprietary. AMD has a lot of work to do - and I applaud them for HBM; I'm really hoping they stay on top of that tech now that they have a head start. They failed to do so with GDDR5; they need to do it with HBM. But that's just a start. AMD has their hands in a lot of cookie jars. They should really start pulling an NVidia and absolutely NAIL one of the avenues they're in, THEN trickle that down to the rest.
 

PnCIa

Member
Reading this thread makes me shake my head sometimes. Sure, AMD has a lot of work to do, as has been stated many times, and they are fucking it up a lot more atm than Nvidia.

But what's happening with the whole proprietary stuff, be it Hairworks or TressFX, could, if it gets extreme, catapult us back to the 90s, with separate rendering paths and feature sets for different GPUs. That would be the worst thing to happen. Therefore, what Nvidia is doing right now, and be it only because they can, is a very, very bad thing.
 

NeOak

Member
Reading this thread makes me shake my head sometimes. Sure, AMD has a lot of work to do, as has been stated many times, and they are fucking it up a lot more atm than Nvidia.

But what's happening with the whole proprietary stuff, be it Hairworks or TressFX, could, if it gets extreme, catapult us back to the 90s, with separate rendering paths and feature sets for different GPUs. That would be the worst thing to happen. Therefore, what Nvidia is doing right now, and be it only because they can, is a very, very bad thing.

Oh. The dark days of Glide.
 

Renekton

Member
But what's happening with the whole proprietary stuff, be it Hairworks or TressFX, could, if it gets extreme, catapult us back to the 90s, with separate rendering paths and feature sets for different GPUs. That would be the worst thing to happen. Therefore, what Nvidia is doing right now, and be it only because they can, is a very, very bad thing.
Unlikely; in the end, Gameworks will have to work with DX12 or Vulkan instead of standing alone. Worst case, we just miss features like shampoo hair and Power Rangers physics. Microsoft can also add cloth simulation and tessellation support to DirectX, unless Nvidia sues them haha
 