I think one of the important things here is that this isn't something invisible that can only be measured in graphs. Put the wrong kind of GPU setup on the wrong kind of game, and you can easily see it. I saw it in Fallout 3 with my 4850 Crossfire setup (though I didn't know what it was back then), I saw it in Skyrim and New Vegas with my GTX 460 SLI setup, and judging by that slightly slowed-down gameplay video, you can pretty easily see and feel it in games like Skyrim on the 7950. Sure, all of these examples are at their most obvious inside the rather geriatric GameBryo/Creation engine, but not every game is built on the most efficient and compatible engines out there. Even if you don't like Bethesda games, something will come along that only delivers smooth frametimes on the cards and drivers specifically tuned to reduce stutter - if AMD and Nvidia aren't vigilant, this could become a real problem again.
I'm sure plenty of people will stick their fingers in their ears and say BETTER FPS LALA I CAN'T HEAR YOU, but luckily both AMD and Nvidia are now keeping this in mind. GPU makers build their products to sell, and in the past they have definitely optimized their cards for the best frame rates, since that's been the headline metric for measuring performance - but now there's a second metric pushing them to be consistent as well. You don't want consistency? That's cool, don't demand it.
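The FPS-versus-consistency point is easy to show with numbers. Here's a minimal sketch using two made-up frametime traces (not real capture data): both average almost exactly 60 FPS, but one alternates between fast and slow frames, which is exactly the stutter an average-FPS chart hides and a 99th-percentile frametime reveals.

```python
# Two hypothetical frametime traces in milliseconds. Same average FPS,
# wildly different consistency. Values are illustrative, not measured.
import statistics

smooth = [16.7] * 120          # steady ~60 FPS, every frame takes 16.7 ms
stutter = [8.35, 25.05] * 60   # same mean frametime, alternating fast/slow

def avg_fps(frametimes_ms):
    """Average FPS over the whole trace."""
    return 1000.0 / statistics.mean(frametimes_ms)

def p99(frametimes_ms):
    """99th-percentile frametime: what the worst ~1% of frames cost."""
    ordered = sorted(frametimes_ms)
    return ordered[int(0.99 * (len(ordered) - 1))]

print(f"smooth : {avg_fps(smooth):.1f} FPS avg, 99th-pct {p99(smooth):.2f} ms")
print(f"stutter: {avg_fps(stutter):.1f} FPS avg, 99th-pct {p99(stutter):.2f} ms")
```

An FPS bar chart would score these two identically; the percentile frametime is what separates the smooth card from the stuttery one.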