I don't know about this.
I do not consider the current state of PC gaming to be better than it was from the early nineties to the early 00's. The notion that PC gaming has not always been the technological arms race it was as recently as four years ago is categorically false; it is merely that the arms race could no longer be conducted by basement coders, and at some point something had to give.
I consider the current state of PC gaming to be a compromise born of the realization that, given the continuing advancement of technology, PC gaming can never again be as versatile and bleeding-edge-oriented as it was in the nineties. The inevitable rise in development budgets (a rise accompanied by rising consumer standards) means that the bleeding edge is now entirely impractical for inexpensive development. I don't see how this could possibly be perceived as a good thing; inevitable as it is, it has contributed to the lack of variety among titles that explicitly aim to push the available hardware. PC gaming has always been known for its relative variety, owing to the diversity of development budgets and the accessibility of the platform; RTS games were (and mostly still are) unheard of on consoles. TBS games, HnS games, roguelikes, war simulation games, flight simulators, train conducting games: these are genres that live on the PC and have lived there for a while. The presence of a second-tier development crew (second-tier referring to popularity, not quality) is not new to the PC; what is new (and by new, I mean from about 2003 onward) is the gap between what is technically within reach of a second-tier team and a first-tier team.
I also do not hold NVidia and ATI particularly responsible for this, because for as long as I have been involved in PC gaming there has always been an actor pushing the definition of high-end, and these actors have in turn driven the graphical fidelity of the games that developers produced. In fact, both NVidia and ATI contributed greatly to reducing the price of the high end; a high-end PC now costs a fifth of what it did in 1998.
I would argue that we are better off than we were five years ago but worse off than we were ten years ago. During the PS2/XBox/GCN era, I felt that PC gaming suffered an identity crisis: it couldn't handle the fact that development costs had gotten too high for every game out there to meet a certain level of graphical and performance quality, and as a result the development of interesting and unique games halted because they became too expensive to make (under the requirement that they remain bleeding edge). In recent years we (PC gamers) have conceded that these second-tier developers cannot and never will be able to push the limits of the available technology, but that that is okay. We have come to this concession largely because these developers are producing fantastic and interesting games that would never appear anywhere else, which is the element chiefly responsible for the loyal following that PC gaming commands. It is still a concession, though, and I have a hard time seeing it as anything else.
This post repeats itself a lot (I'm very tired), so thank you if you've made it all the way through, and better yet if you understand what I'm trying to say.