
Ex-NVIDIA Driver Developer on Why Every Triple-A Game Ships Broken & Multi-GPUs

GameDev forum member ‘Promit’ (an ex-NVIDIA driver developer) claimed that nearly every triple-A game ships broken (or at least shipped broken while he was working for the green team):

“Nearly every game ships broken. We’re talking major AAA titles from vendors who are everyday names in the industry. In some cases, we’re talking about blatant violations of API rules – one D3D9 game never even called BeginFrame/EndFrame. Some are mistakes or oversights – one shipped bad shaders that heavily impacted performance on NV drivers. These things were day to day occurrences that went into a bug tracker. Then somebody would go in, find out what the game screwed up, and patch the driver to deal with it. There are lots of optional patches already in the driver that are simply toggled on or off as per-game settings, and then hacks that are more specific to games – up to and including total replacement of the shipping shaders with custom versions by the driver team. Ever wondered why nearly every major game release is accompanied by a matching driver release from AMD and/or NVIDIA? There you go.”

Promit shared his opinions on a number of topics and talked a bit about multi-GPU setups. As the former NVIDIA developer put it, multi-GPU is f’ing complicated.

“You cannot begin to conceive of the number of failure cases that are involved until you see them in person. I suspect that more than half of the total software effort within the IHVs is dedicated strictly to making multi-GPU setups work with existing games. (And I don’t even know what the hardware side looks like.) If you’ve ever tried to independently build an app that uses multi GPU – especially if, god help you, you tried to do it in OpenGL – you may have discovered this insane rabbit hole. There is ONE fast path, and it’s the narrowest path of all.”

http://www.dsogaming.com/news/ex-nv...-every-triple-a-games-ship-broken-multi-gpus/
 
So basically, big developers offload some of their bug fixing work to the GPU vendors by not bothering to fix bugs that can be fixed by driver changes? Seems pretty logical, if shitty.
 

Kezen

Banned
Note: this dev was talking about the Vista period.
Many years ago, I briefly worked at NVIDIA on the DirectX driver team (internship). This is Vista era, when a lot of people were busy with the DX10 transition
 

BibiMaghoo

Member
Why would they spend so much time and effort fixing other people's games?

They didn't make them, nor are they responsible. I don't understand why they would do that.
 

No_Style

Member
Why would they spend so much time and effort fixing other people's games?

They didn't make them, nor are they responsible. I don't understand why they would do that.

It's a PR move. If NVIDIA doesn't fix them but AMD does, then NVIDIA GPUs look bad because they can't run Game X on launch day. Then there are instances where it's an "NVIDIA" or "AMD" game, and then you have no choice but to ensure that it works.

The only way the GPU manufacturers win is if they are united against this shitty practice.
 

wheapon

Member
Why would they spend so much time and effort fixing other people's games?

They didn't make them, nor are they responsible. I don't understand why they would do that.

Imagine that a given game runs really well on one GPU platform but very poorly on another. If you were interested in that game, the performance stats could influence your purchase. I'd imagine that's part of the reason NVIDIA does this.
 

trw

Member
Why would they spend so much time and effort fixing other people's games?

They didn't make them, nor are they responsible. I don't understand why they would do that.

When they fix it and AMD doesn't, you get the current situation: AMD has a reputation for bad drivers and NVIDIA has a huge portion of the market share.
 
So basically, big developers offload some of their bug fixing work to the GPU vendors by not bothering to fix bugs that can be fixed by driver changes? Seems pretty logical, if shitty.

No, big developers offload some of their bug-fixing work to the GPU vendors because the driver is a gigantic black box that devs can't inspect. I think Valve talked a lot about this when they were doing their port to Linux, where they were able to improve the state of affairs across the board because they could look at their game's source, the driver's source, and the OS's source, all side by side, with the driver team there. If you actually read the post, Promit goes into this situation of "the game guessing what the driver should be doing while the driver guesses what the game should be doing" causing a giant clusterfuck in greater detail.
 

Swifty

Member
You guys should read the entire post. A lot of the problem is structural. DirectX and OpenGL just cause a lot of headaches because they promise to do all this abstraction but end up having so many pitfalls that cause developers to accidentally trigger "slow paths" in their code.

That's why DX12 and Vulkan are so important: developers finally get a transparent look at what commands and data get sent to the GPU.

Many years ago, I briefly worked at NVIDIA on the DirectX driver team (internship). This is Vista era, when a lot of people were busy with the DX10 transition, the hardware transition, and the OS/driver model transition. My job was to get games that were broken on Vista, dismantle them from the driver level, and figure out why they were broken. While I am not at all an expert on driver matters (and actually sucked at my job, to be honest), I did learn a lot about what games look like from the perspective of a driver and kernel.

The first lesson is: Nearly every game ships broken. We're talking major AAA titles from vendors who are everyday names in the industry. In some cases, we're talking about blatant violations of API rules - one D3D9 game never even called BeginFrame/EndFrame. Some are mistakes or oversights - one shipped bad shaders that heavily impacted performance on NV drivers. These things were day to day occurrences that went into a bug tracker. Then somebody would go in, find out what the game screwed up, and patch the driver to deal with it. There are lots of optional patches already in the driver that are simply toggled on or off as per-game settings, and then hacks that are more specific to games - up to and including total replacement of the shipping shaders with custom versions by the driver team. Ever wondered why nearly every major game release is accompanied by a matching driver release from AMD and/or NVIDIA? There you go.

The second lesson: The driver is gigantic. Think 1-2 million lines of code dealing with the hardware abstraction layers, plus another million per API supported. The backing function for Clear in D3D 9 was close to a thousand lines of just logic dealing with how exactly to respond to the command. It'd then call out to the correct function to actually modify the buffer in question. The level of complexity internally is enormous and winding, and even inside the driver code it can be tricky to work out how exactly you get to the fast-path behaviors. Additionally the APIs don't do a great job of matching the hardware, which means that even in the best cases the driver is covering up for a LOT of things you don't know about. There are many, many shadow operations and shadow copies of things down there.
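For readers who haven't touched D3D9, the entry points the post refers to are named IDirect3DDevice9::BeginScene and EndScene, and a frame is supposed to be structured roughly like this hedged sketch (the device pointer and the elided draw calls are placeholders, not code from any shipped title):

```cpp
// Minimal sketch of the D3D9 frame structure being described.
// "device" is assumed to be an already-created IDirect3DDevice9*.
#include <windows.h>
#include <d3d9.h>

void RenderFrame(IDirect3DDevice9* device)
{
    // The app-facing Clear call is one line; per the second lesson, the
    // backing driver function behind it was close to a thousand lines.
    device->Clear(0, nullptr, D3DCLEAR_TARGET | D3DCLEAR_ZBUFFER,
                  D3DCOLOR_XRGB(0, 0, 0), 1.0f, 0);

    // The API requires draw calls to sit inside this pair; a title that
    // never issues it is the kind of violation the first lesson describes.
    if (SUCCEEDED(device->BeginScene()))
    {
        // ... SetRenderState / SetTexture / DrawIndexedPrimitive calls ...
        device->EndScene();
    }

    device->Present(nullptr, nullptr, nullptr, nullptr);
}
```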

The third lesson: It's unthreadable. The IHVs sat down starting from maybe circa 2005, and built tons of multithreading into the driver internally. They had some of the world's best kernel/driver engineers in the world to do it, and literally thousands of full blown real world test cases. They squeezed that system dry, and within the existing drivers and APIs it is impossible to get more than trivial gains out of any application side multithreading. If Futuremark can only get 5% in a trivial test case, the rest of us have no chance.
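For context, the "application side multithreading" the third lesson dismisses is the kind of thing D3D11 deferred contexts offered. A minimal sketch, assuming a device and an immediate context already exist; the point is that all the recorded work still funnels through one submission path and the driver's own internal threading:

```cpp
// Hedged sketch of app-side threading under D3D11 deferred contexts.
#include <windows.h>
#include <d3d11.h>

void RecordOnWorkerThread(ID3D11Device* device, ID3D11CommandList** outList)
{
    ID3D11DeviceContext* deferred = nullptr;
    device->CreateDeferredContext(0, &deferred);

    // ... record state changes and draw calls on this worker thread ...

    // Bundle the recorded work into a command list.
    deferred->FinishCommandList(FALSE, outList);
    deferred->Release();
}

void SubmitOnMainThread(ID3D11DeviceContext* immediateCtx, ID3D11CommandList* list)
{
    // Execution still goes through the single immediate context, and through
    // the driver's own heavily threaded internals behind it, so the gains
    // on the application side end up being trivial.
    immediateCtx->ExecuteCommandList(list, FALSE);
    list->Release();
}
```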

The fourth lesson: Multi GPU (SLI/CrossfireX) is fucking complicated. You cannot begin to conceive of the number of failure cases that are involved until you see them in person. I suspect that more than half of the total software effort within the IHVs is dedicated strictly to making multi-GPU setups work with existing games. (And I don't even know what the hardware side looks like.) If you've ever tried to independently build an app that uses multi GPU - especially if, god help you, you tried to do it in OpenGL - you may have discovered this insane rabbit hole. There is ONE fast path, and it's the narrowest path of all. Take lessons 1 and 2, and magnify them enormously.

Deep breath.

Ultimately, the new APIs are designed to cure all four of these problems.
* Why are games broken? Because the APIs are complex, and validation varies from decent (D3D 11) to poor (D3D 9) to catastrophic (OpenGL). There are lots of ways to hit slow paths without knowing anything has gone awry, and often the driver writers already know what mistakes you're going to make and are dynamically patching in workarounds for the common cases.
* Maintaining the drivers with the current wide surface area is tricky. Although AMD and NV have the resources to do it, the smaller IHVs (Intel, PowerVR, Qualcomm, etc) simply cannot keep up with the necessary investment. More importantly, explaining to devs the correct way to write their render pipelines has become borderline impossible. There's too many failure cases. it's been understood for quite a few years now that you cannot max out the performance of any given GPU without having someone from NVIDIA or AMD physically grab your game source code, load it on a dev driver, and do a hands-on analysis. These are the vanishingly few people who have actually seen the source to a game, the driver it's running on, and the Windows kernel it's running on, and the full specs for the hardware. Nobody else has that kind of access or engineering ability.
* Threading is just a catastrophe and is being rethought from the ground up. This requires a lot of the abstractions to be stripped away or retooled, because the old ones required too much driver intervention to be properly threadable in the first place.
* Multi-GPU is becoming explicit. For the last ten years, it has been AMD and NV's goal to make multi-GPU setups completely transparent to everybody, and it's become clear that for some subset of developers, this is just making our jobs harder. The driver has to apply imperfect heuristics to guess what the game is doing, and the game in turn has to do peculiar things in order to trigger the right heuristics. Again, for the big games somebody sits down and matches the two manually.
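As a rough illustration of the explicit model in the last bullet: under a D3D12-style explicit multi-adapter setup, the application enumerates every physical GPU itself through DXGI and decides what runs where, instead of relying on the driver's heuristics. This is a hedged sketch, not anything taken from the post:

```cpp
// Hedged sketch: the app sees each adapter directly rather than the driver
// pretending there is one GPU (the old SLI/CrossFireX transparency).
#include <windows.h>
#include <dxgi1_4.h>
#include <vector>

std::vector<IDXGIAdapter1*> EnumerateAdapters()
{
    IDXGIFactory4* factory = nullptr;
    CreateDXGIFactory1(IID_PPV_ARGS(&factory));

    std::vector<IDXGIAdapter1*> adapters;
    IDXGIAdapter1* adapter = nullptr;
    for (UINT i = 0;
         factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
    {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        // The app can now target each GPU explicitly (its own queues, its own
        // resource copies) instead of the driver guessing via AFR heuristics.
        adapters.push_back(adapter);
    }

    factory->Release();
    return adapters;  // caller releases each adapter when done
}
```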

Part of the goal is simply to stop hiding what's actually going on in the software from game programmers. Debugging drivers has never been possible for us, which meant a lot of poking and prodding and experimenting to figure out exactly what it is that is making the render pipeline of a game slow. The IHVs certainly weren't willing to disclose these things publicly either, as they were considered critical to competitive advantage. (Sure they are guys. Sure they are.) So the game is guessing what the driver is doing, the driver is guessing what the game is doing, and the whole mess could be avoided if the drivers just wouldn't work so hard trying to protect us.

So why didn't we do this years ago? Well, there are a lot of politics involved (cough Longs Peak) and some hardware aspects but ultimately what it comes down to is the new models are hard to code for. Microsoft and ARB never wanted to subject us to manually compiling shaders against the correct render states, setting the whole thing invariant, configuring heaps and tables, etc. Segfaulting a GPU isn't a fun experience. You can't trap that in a (user space) debugger. So ... the subtext that a lot of people aren't calling out explicitly is that this round of new APIs has been done in cooperation with the big engines. The Mantle spec is effectively written by Johan Andersson at DICE, and the Khronos Vulkan spec basically pulls Aras P at Unity, Niklas S at Epic, and a couple guys at Valve into the fold.

Three out of those four just made their engines public and free with minimal backend financial obligation.

Now there's nothing wrong with any of that, obviously, and I don't think it's even the big motivating raison d'etre of the new APIs. But there's a very real message that if these APIs are too challenging to work with directly, well the guys who designed the API also happen to run very full featured engines requiring no financial commitments. So that's served to considerably smooth the politics involved in rolling these difficult to work with APIs out to the market.
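To make "manually compiling shaders against the correct render states ... configuring heaps and tables" from a couple of paragraphs up a bit more concrete, here is a hedged sketch using D3D12 pipeline state objects as the example; rootSig and the shader bytecode blobs are assumed to already exist, and the state values are arbitrary:

```cpp
// Hedged sketch: in D3D12 the shaders and every relevant render state are
// baked up front into one immutable pipeline state object, so the driver
// never has to patch them behind the app's back at draw time.
#include <windows.h>
#include <d3d12.h>

ID3D12PipelineState* CreatePipeline(ID3D12Device* device,
                                    ID3D12RootSignature* rootSig,
                                    D3D12_SHADER_BYTECODE vsBytecode,
                                    D3D12_SHADER_BYTECODE psBytecode)
{
    D3D12_GRAPHICS_PIPELINE_STATE_DESC desc = {};
    desc.pRootSignature = rootSig;   // the "tables" the post mentions
    desc.VS = vsBytecode;            // shaders bound at build time,
    desc.PS = psBytecode;            // not swapped later by the driver

    desc.RasterizerState.FillMode = D3D12_FILL_MODE_SOLID;
    desc.RasterizerState.CullMode = D3D12_CULL_MODE_BACK;
    desc.RasterizerState.DepthClipEnable = TRUE;

    D3D12_RENDER_TARGET_BLEND_DESC& rt0 = desc.BlendState.RenderTarget[0];
    rt0.BlendEnable = FALSE;
    rt0.SrcBlend = D3D12_BLEND_ONE;
    rt0.DestBlend = D3D12_BLEND_ZERO;
    rt0.BlendOp = D3D12_BLEND_OP_ADD;
    rt0.SrcBlendAlpha = D3D12_BLEND_ONE;
    rt0.DestBlendAlpha = D3D12_BLEND_ZERO;
    rt0.BlendOpAlpha = D3D12_BLEND_OP_ADD;
    rt0.RenderTargetWriteMask = D3D12_COLOR_WRITE_ENABLE_ALL;

    desc.DepthStencilState.DepthEnable = FALSE;
    desc.DepthStencilState.StencilEnable = FALSE;
    desc.SampleMask = 0xFFFFFFFFu;
    desc.PrimitiveTopologyType = D3D12_PRIMITIVE_TOPOLOGY_TYPE_TRIANGLE;
    desc.NumRenderTargets = 1;
    desc.RTVFormats[0] = DXGI_FORMAT_R8G8B8A8_UNORM;
    desc.SampleDesc.Count = 1;

    ID3D12PipelineState* pso = nullptr;
    device->CreateGraphicsPipelineState(&desc, IID_PPV_ARGS(&pso));
    return pso;  // the whole render-state combination is now invariant
}
```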

The last piece to the puzzle is that we ran out of new user-facing hardware features many years ago. Ignoring raw speed, what exactly is the user-visible or dev-visible difference between a GTX 480 and a GTX 980? A few limitations have been lifted (notably in compute) but essentially they're the same thing. MS, for all practical purposes, concluded that DX was a mature, stable technology that required only minor work and mostly disbanded the teams involved. Many of the revisions to GL have been little more than API repairs. (A GTX 480 runs full featured OpenGL 4.5, by the way.) So the reason we're seeing new APIs at all stems fundamentally from Andersson hassling the IHVs until AMD woke up, smelled competitive advantage, and started paying attention. That essentially took a three year lag time from when we got hardware to the point that compute could be directly integrated into the core of a render pipeline, which is considered normal today but was bluntly revolutionary at production scale in 2012. It's a lot of small things adding up to a sea change, with key people pushing on the right people for the right things.


Phew. I'm no longer sure what the point of that rant was, but hopefully it's somehow productive that I wrote it. Ultimately the new APIs are the right step, and they're retroactively useful to old hardware which is great. They will be harder to code. How much harder? Well, that remains to be seen. Personally, my take is that MS and ARB always had the wrong idea. Their idea was to produce a nice, pretty looking front end and deal with all the awful stuff quietly in the background. Yeah it's easy to code against, but it was always a bitch and a half to debug or tune. Nobody ever took that side of the equation into account. What has finally been made clear is that it's okay to have difficult to code APIs, if the end result just works. And that's been my experience so far in retooling: it's a pain in the ass, requires widespread revisions to engine code, forces you to revisit a lot of assumptions, and generally requires a lot of infrastructure before anything works. But once it's up and running, there's no surprises. It works smoothly, you're always on the fast path, anything that IS slow is in your OWN code which can be analyzed by common tools. It's worth it.
 

_machine

Member
You guys should read the entire post. A lot of the problem is structural. DirectX and OpenGL just cause a lot of headaches because they promise to do all this abstraction but end up having so many pitfalls that cause developers to accidentally trigger "slow paths" in their code.

That's why DX12 and Vulkan are so important: developers finally get a transparent look at what commands and data get sent to the GPU.
Yeah, people are seriously jumping to conclusions here. Driver-level debugging is simply near-impossible at the moment, and that leads to broken code that can't be easily fixed without the vendor's assistance. I've heard that it was especially bad during that era, and I'm sure the situation has improved by miles since then, as cross-platform development is much more common and tools have improved a lot since 2007.
 
I think the title of ex-NVIDIA driver developer is a bit off; he was a paid intern who was there many years ago, during the DX10 transition in the Vista days. The title makes it sound like more than it is. Not doubting the general assessment, but the way the title is worded makes it sound like a high-up who just left yesterday.
 

Eirulan

Member
Very interesting read, thank you for sharing.
I already figured the main reason nearly every "big" game gets a "game ready" driver is to provide game-specific bugfixes and code exceptions.
 

_machine

Member
I think the title of ex-NVIDIA driver developer is a bit off; he was a paid intern who was there many years ago, during the DX10 transition in the Vista days. The title makes it sound like more than it is. Not doubting the general assessment, but the way the title is worded makes it sound like a high-up who just left yesterday.
True, but I think "Triple-A Game Ships Broken" is far worse, as most people have misunderstood the whole article just from reading the title (which does sound a bit sensationalistic and is easily misinterpreted). This is about the whole driver and API situation being a major pain in the ass, pretty much forcing developers to ship broken titles for the IHVs to fix, and it's mainly about why Vulkan is happening. It has hardly anything to do with the quality of titles released lately (and this was 2007, when the API/driver problem was even larger because of DX9/DX10 and OpenGL quality).
 

rambis

Banned
This has always been a problem with PC development as it's been structured.

Yeah, people are seriously jumping to conclusions here. Driver-level debugging is simply near-impossible at the moment, and that leads to broken code that can't be easily fixed without the vendor's assistance. I've heard that it was especially bad during that era, and I'm sure the situation has improved by miles since then, as cross-platform development is much more common and tools have improved a lot since 2007.
Of the 10 posts prior to this one, who jumped to conclusions?
 

_machine

Member
Of the 10 posts prior to this one, who jumped to conclusions?
I'm pretty sure these posts misunderstood the post and jumped to conclusions (as in blaming developers for the situation without considering the big picture):
So basically, big developers offload some of their bug fixing work to the GPU vendors by not bothering to fix bugs that can be fixed by driver changes? Seems pretty logical, if shitty.
You know, it doesn't seem too different now. In fact it's worse -- devs are offloading work onto the consumers! Praise be to Durante.
Why would they spend so much time and effort fixing other people's games?

They didn't make them, nor are they responsible. I don't understand why they would do that.
It's a PR move. If NVIDIA doesn't fix them but AMD does, then NVIDIA GPUs look bad because they can't run Game X on launch day. Then there are instances where it's an "NVIDIA" or "AMD" game, and then you have no choice but to ensure that it works.

The only way the GPU manufacturers win is if they are united against this shitty practice.
None of those posts address that this is a problem at the API/driver level and isn't just (once again) about developers being lazy or "shitty" or whatever. The original post is about a situation that was caused by a multitude of problems and is something that Vulkan is supposed to fix.
 

rambis

Banned
I'm pretty sure these posts misunderstood the post and jumped to conclusions (as in blaming developers for the situation without considering the big picture):




None of those posts address that this is a problem at the API/driver level and isn't just (once again) about developers being lazy or "shitty" or whatever. The original post is about a situation that was caused by a multitude of problems and is something that Vulkan is supposed to fix.
Trying to label it all as structural issues seems like just as big a misunderstanding as anything. Some of what he details (bad syntax or suboptimal shaders) seems like it could've been remedied at the developer level but was overlooked. So I think those are some valid criticisms. But no, it shouldn't have been titled the way it was.
 

Durante

Member
Yeah, this (or rather, the original forum post, which doesn't really need paraphrasing) was posted in another thread. Not really anything new if you have been following this closely, but more people reading the original post can only be a good thing.

Why would they spend so much time and effort fixing other people's games?
Competitive advantage.

At this point it's basically impossible for anyone not AMD or Nvidia to even try and enter the PC graphics market, even if they had superior hardware. Because they don't have an accumulated storage of millions of lines of game-specific driver optimizations and fixes.
 

_machine

Member
Trying to label it all as structural issues seems like just as big a misunderstanding as anything. Some of what he details (bad syntax or suboptimal shaders) seems like it could've been remedied at the developer level but was overlooked. So I think those are some valid criticisms. But no, it shouldn't have been titled the way it was.
Of course developers aren't without fault and are certainly responsible for obvious mistakes (though they rarely are so obvious during development). But as he mentioned himself (being a developer as well), the limited possibilities for debugging make it really hard to correct them (nor do they really get enough feedback from the issues fixed at the driver level to improve the process next time), and that, combined with lackluster documentation, was by far the biggest reason for the issues back then. Things have changed for the better, but IHVs stepping up to fix problems is still mainly a symptom of the tools available to developers (especially given the high usage of middleware these days).
 
I've typically stayed with what I'm comfortable with on the OS side up until recently; however, testimony like this makes me want to help support the transition away from legacy APIs (preservation and archiving excepted) and towards newer standards.

The timing has never been better for proper upheaval at a lower level in PC gaming.
 

Coconut

Banned
Why would they spend so much time and effort fixing other people's games?

They didn't make them, nor are they responsible. I don't understand why they would do that.

Because posts that read "I'm not seeing any issues on my nVidia card" sell nVidia cards.


Imagine how awesome your card would be running if they weren't trying to fix broken games.
 

Tetranet

Member
Competitive advantage.

At this point it's basically impossible for anyone not AMD or Nvidia to even try and enter the PC graphics market, even if they had superior hardware. Because they don't have an accumulated storage of millions of lines of game-specific driver optimizations and fixes.

Hmm, and what does that mean for the promise of eternal compatibility on the PC? It makes the whole affair significantly more unstable.
 

Durante

Member
Hmm, and what does that mean for the promise of eternal compatibility on the PC? It makes the whole affair significantly more unstable.
Not really, or perhaps only very slightly. I'd wager that the vast majority of any game specific driver code is either for performance optimization or "hacks" like SLI -- if you just want to play an old game you don't need either of those.
 
Fun fact: if you set the object name on a D3D resource at runtime in your game in order to improve the semantic richness of, say, a frame capture, the driver team has no way to access that object name. The number of times we've been talking to an IHV in the form of them saying "yeah, it's a 1M buffer, and you do a Map of it in the middle of the frame, update it, and then it gets copied to another resource, and that's causing a weird sync point because of an issue in our driver because you have a UAV mapped on it, too, and then it gets doubly screwed on multi-GPU because of the late push, so, yeah, do you know what resource that is?" is too damn high.
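For anyone wondering what "set the object name on a D3D resource" refers to: in D3D11 it's a private-data blob keyed by WKPDID_D3DDebugObjectName, as in the hedged sketch below (the buffer and the name are made up). Capture and debug tools can read it; as noted above, the driver team on the other end of a bug report cannot.

```cpp
// Sketch of attaching a debug name to a D3D11 resource. The name shows up in
// frame captures and the debug layer, not in the vendor's driver telemetry.
// Link against dxguid.lib for WKPDID_D3DDebugObjectName.
#include <windows.h>
#include <d3d11.h>
#include <d3dcommon.h>
#include <cstring>

void NameResource(ID3D11DeviceChild* resource, const char* name)
{
    resource->SetPrivateData(WKPDID_D3DDebugObjectName,
                             static_cast<UINT>(std::strlen(name)), name);
}

// Hypothetical usage: NameResource(particleBuffer, "ParticleInstanceBuffer");
```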

But yes, we have entered a glorious new era of APIs and eventually it will all shake out to a much better situation. Onward and upward.
 

Faustek

Member
Thanks for sharing.

Yes, this was during the Vista (I still shudder) era, but I do believe it still mucks around in the same waters. Nothing to do about it really. Just hope your vendor is quick with the updates in case of a *mishap*.
 

mrklaw

MrArseFace
Is this partly why GPU drivers are so large? Because they contain a cumulative mass of patches and fixes for every game that needs them, going back years?
 

dr_rus

Member
At this point it's basically impossible for anyone not AMD or Nvidia to even try and enter the PC graphics market, even if they had superior hardware. Because they don't have an accumulated storage of millions of lines of game-specific driver optimizations and fixes.

I dunno about that. These fixes are architecture-specific for the most part, and both NV and AMD tend to switch to new architectures, which break these fixes more often than not. Then they fix the fixes, which kinda indicates that there isn't THAT much really critical stuff in the drivers. Most fixes are generic or simply become obsolete on newer, more powerful hardware because the hardware is fast enough anyway. So for a new player to have a go at the market, all they really need to do is optimize for a couple dozen up-to-date titles, half of which are used in review benchmarks. That's not much.
 
I dunno about that. These fixes are architecture-specific for the most part, and both NV and AMD tend to switch to new architectures, which break these fixes more often than not. Then they fix the fixes, which kinda indicates that there isn't THAT much really critical stuff in the drivers. Most fixes are generic or simply become obsolete on newer, more powerful hardware because the hardware is fast enough anyway. So for a new player to have a go at the market, all they really need to do is optimize for a couple dozen up-to-date titles, half of which are used in review benchmarks. That's not much.

I assume all the fixes sit on top of their internal hardware abstraction layer. They get ported to new architectures as hardware enablement for new architectures is done.

They don't rewrite the entire driver for every new GPU...
 

dr_rus

Member
I assume all the fixes sit on top of their internal hardware abstraction layer. They get ported to new architectures as hardware enablement for new architectures is done.

They don't rewrite the entire driver for every new GPU...

As I've said, these fixes are mostly arch-specific. You can't "port" them to a new architecture because that defeats the point of their existence. And I really, really doubt that these fixes are stored in some h/w abstraction layer.

They do rewrite a significant portion of their drivers for each new architecture.
 
Is this partly why GPU drivers are so large? Because they contain a cumulative mass of patches and fixes for every game that needs them, going back years?
Yes, GPU drivers are sort of like those broken "per-game speed hack" emulators, with the obvious difference being they are the target platform.

I don't think they enumerate games, though. I know nothing, obviously, so feel free to correct me, but I'm going to guess they detect certain optimizable (?) cases through call sequences.
 

dr_rus

Member
Is this partly why GPU drivers are so large? Because they contain a cumulative mass of patches and fixes for every game that needs them, going back years?

No. The DirectX graphics driver itself in NV's latest driver package is less than 20 MB out of the ~520 MB total. The rest is localization files and images, OpenCL/OpenGL drivers, 3D Vision drivers, GeForce Experience, the control panel, PhysX, etc.
 

Renekton

Member
Ever wondered why nearly every major game release is accompanied by a matching driver release from AMD and/or NVIDIA? There you go.
Won't drivers become insanely bloated due to the sheer number of specific use cases for every AAA release?
 

_machine

Member
I guess this explains why a lot of smaller releases come out in fairly decent shape, or get fixes patched by the developer quickly. They aren't able to force the hand of the GPU guys.
Please read the full post (not the article, which is a bit out of context); it has little to do with common gameplay bugs. It's about the situation in 2007, when most developers used in-house engines and needed that support from vendors. This hardly concerns today's indie developers, as they mostly use middleware that has already gone through that phase.

Is this partly why GPU drivers are so large? Because they contain a cumulative mass of patches and fixes for every game that needs them, going back years?
Yup, it's there in the original post as well:
"The second lesson: The driver is gigantic. Think 1-2 million lines of code dealing with the hardware abstraction layers, plus another million per API supported. The backing function for Clear in D3D 9 was close to a thousand lines of just logic dealing with how exactly to respond to the command. It'd then call out to the correct function to actually modify the buffer in question. The level of complexity internally is enormous and winding, and even inside the driver code it can be tricky to work out how exactly you get to the fast-path behaviors. Additionally the APIs don't do a great job of matching the hardware, which means that even in the best cases the driver is covering up for a LOT of things you don't know about. There are many, many shadow operations and shadow copies of things down there."
 

orava

Member
I know it would be a pretty much impossible thing to do, but I wish the GPU manufacturers would just refuse to run a game that would need fixing from their side.
 

patapuf

Member
I know it would be a pretty much impossible thing to do, but I wish the GPU manufacturers would just refuse to run a game that would need fixing from their side.

I prefer it when my games run well as early as possible.

To me, it doesn't really matter if it was Nvidia or the dev that fixed it.
 

_machine

Member
I know it would be a pretty much impossible thing to do, but I wish the GPU manufacturers would just refuse to run a game that would need fixing from their side.
Please read the full post (not the article); it was absolutely necessary back then because of the API and driver tools being what they were, and even today for engine developers it's not really out of shitty behaviour, but rather the situation being really complicated for all parties. Developers are not without fault, but this isn't something they could realistically fix alone, and it is a massively complicated issue. That is why Vulkan is being developed; it addresses these problems.

The last piece to the puzzle is that we ran out of new user-facing hardware features many years ago. Ignoring raw speed, what exactly is the user-visible or dev-visible difference between a GTX 480 and a GTX 980? A few limitations have been lifted (notably in compute) but essentially they're the same thing. MS, for all practical purposes, concluded that DX was a mature, stable technology that required only minor work and mostly disbanded the teams involved. Many of the revisions to GL have been little more than API repairs. (A GTX 480 runs full featured OpenGL 4.5, by the way.) So the reason we're seeing new APIs at all stems fundamentally from Andersson hassling the IHVs until AMD woke up, smelled competitive advantage, and started paying attention. That essentially took a three year lag time from when we got hardware to the point that compute could be directly integrated into the core of a render pipeline, which is considered normal today but was bluntly revolutionary at production scale in 2012. It's a lot of small things adding up to a sea change, with key people pushing on the right people for the right things.


Phew. I'm no longer sure what the point of that rant was, but hopefully it's somehow productive that I wrote it. Ultimately the new APIs are the right step, and they're retroactively useful to old hardware which is great. They will be harder to code. How much harder? Well, that remains to be seen. Personally, my take is that MS and ARB always had the wrong idea. Their idea was to produce a nice, pretty looking front end and deal with all the awful stuff quietly in the background. Yeah it's easy to code against, but it was always a bitch and a half to debug or tune. Nobody ever took that side of the equation into account. What has finally been made clear is that it's okay to have difficult to code APIs, if the end result just works. And that's been my experience so far in retooling: it's a pain in the ass, requires widespread revisions to engine code, forces you to revisit a lot of assumptions, and generally requires a lot of infrastructure before anything works. But once it's up and running, there's no surprises. It works smoothly, you're always on the fast path, anything that IS slow is in your OWN code which can be analyzed by common tools. It's worth it.
 

Joni

Member
I know it would be a pretty much impossible thing to do, but I wish the GPU manufacturers would just refuse to run a game that would need fixing from their side.

I wish developers refused to support companies whose drivers don't work so they don't need to spend that much money on shitty versions. You wouldn't get PC versions in either case. Both statements are just about as clueless as one another.
 

vato_loco

Member
You know, beyond making me realize how little I know about game development, this kind of article just makes me think it's kind of amazing that games work at all, with so many things that could go horribly wrong.
 

mclem

Member
Why would they spend so much time and effort fixing other people's games?

They didn't make them, nor are they responsible. I don't understand why they would do that.

It reflects on their product, and in that field they can't afford to be seen as 'clearly inferior'.
 

BigDug13

Member
Seems like this paints an even grimmer picture for AMD card users, since they don't release drivers nearly as fast as NVIDIA. If every AAA game needs driver work and you don't see a new driver for months, that's not good.
 

ZiggyST

Banned
As a mobile developer I don't understand one thing.
Where do they test their games then?
Do they use a beta driver or something like that? It doesn't make sense.
 
Note: this dev was talking about the Vista period.

Nothing's changed though. PC versions of most multiplatform titles are broken rubbish, probably because they're always outsourced to some crappy, talentless and incompetent studio (Ubisoft Kiev, anyone?).
 