This thread has taken a turn for the stupid.
1. Graphics cards are not just for playing games, which is why reviewers test them with a wide variety of benchmarks. They get used for all kinds of tasks, even though PC gaming is a priority for many of them.
2. Many games do take full advantage of the hardware available to them, but PC developers also need to make their games scale across different types of machines. People who own top-of-the-line equipment are few and far between. All you need to do is head to a developer's technical-support forum and it's easy to see that the vast majority of people playing PC games aren't doing so with the most up-to-date cards and CPUs.
3. Why is anyone angry that the PC is constantly getting improved hardware? Do you need to upgrade every time? No, of course not. But it's an option for those who want to. New cards are not useless, and they perform far better than older ones. They offer better visuals at higher resolutions, and they evolve alongside the rest of the platform: the new generation of cards will support DX11 and the higher-resolution monitors that are now becoming dirt cheap, and they add HDMI and DisplayPort outputs. They have new tech inside for HD video, rendering and various other things people in this thread are not even taking into account. New cards are not just about more performance; they are about more features.
4. Some people in this thread seem to be mad at NVIDIA and ATI because developers aren't taking full advantage of their products. It simply does not make financial sense for any developer to build a game around one specific series of cards coupled with other high-end hardware; not enough people own that equipment. What developers do instead is offer scalable settings, so those lucky enough to have the latest gear can get the most out of it (see the first sketch after this list for the idea).
5. Killzone 2 is garbage compared to many PC games out there. Its visuals are also not on par with Crysis. I also read someone argue "why don't developers take advantage of the hardware and make a game at 30 FPS, 720p resolution?" and then dare to claim they have been playing PC games for a long time. 30 FPS at 720p looks like dog shit on a computer. Even on a 1680x1050 monitor, dropping the resolution to 720p makes you want to vomit. Let's also not forget that 1920x1200 and 1080p monitors are becoming dirt cheap, and that panels come in different aspect ratios, some 16:9, others 16:10 (the second sketch after this list shows the numbers). Put a 30 FPS, 720p image on even a $199 1080p monitor and be prepared to puke all over yourself.
6. The reason developers can get so much out of consoles is that they are designed for gaming and gaming only. Computers are not; a lot of their power goes to overhead, not only from the way PC games have to be programmed but from the OS itself.
7. Like someone said, PC tech evolves quickly, but you don't need to upgrade along with it. Personally, I still use an E8400, 4GB of DDR2 RAM and a 9800GTX with a 1680x1050 monitor. I completely skipped this generation of cards, and I normally upgrade every other generation. I still got to play every game I wanted on max settings or close to it. That said, if you compared my PC to one with a 4890 at 1080p, it would be night and day. So someone who did decide to upgrade got their money's worth, and so did I, considering I was able to skip a generation of cards and still do everything I wanted. My next upgrade will be an i5 machine running one of ATI's new cards and a 1080p monitor.
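Since the "scalable settings" point keeps coming up, here's a rough Python sketch of the idea: the game detects the hardware and picks a default quality preset, which the player can override. The preset names, VRAM thresholds and detection logic are all invented for illustration; no real engine uses these exact numbers.

    # Rough sketch: pick a default quality preset from detected hardware.
    # Preset names and thresholds are made up for this example.
    PRESETS = {
        "low":    {"resolution": (1280, 720),  "shadows": "off",  "antialiasing": 0},
        "medium": {"resolution": (1680, 1050), "shadows": "low",  "antialiasing": 2},
        "high":   {"resolution": (1920, 1200), "shadows": "high", "antialiasing": 4},
    }

    def pick_default_preset(vram_mb: int, latest_gen: bool) -> str:
        """Guess a sensible default preset from detected VRAM and card generation."""
        if vram_mb >= 1024 and latest_gen:
            return "high"
        if vram_mb >= 512:
            return "medium"
        return "low"

    # A 9800GTX-class card (512MB, last generation) lands on "medium";
    # a 4890-class card (1GB, current generation) lands on "high".
    print(pick_default_preset(512, False))   # medium
    print(pick_default_preset(1024, True))   # high

The point is that one game ships with one set of assets, and the hardware decides how much of it actually gets used. That's how a 9800GTX and a 4890 can both run the same title and both owners feel like they got their money's worth.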
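And for point 5, the resolution argument is just arithmetic. Here's a quick Python sketch (again, purely illustrative) of the pixel counts and aspect ratios of the monitors I mentioned:

    from math import gcd

    # Pixel counts and aspect ratios for the monitors mentioned above.
    for w, h in [(1280, 720), (1680, 1050), (1920, 1080), (1920, 1200)]:
        g = gcd(w, h)
        a, b = w // g, h // g
        if a < 16:               # report 8:5 in its more familiar 16:10 form
            a, b = a * 2, b * 2
        print(f"{w}x{h}: {w * h / 1e6:.2f} megapixels, aspect {a}:{b}")

    # Output:
    # 1280x720: 0.92 megapixels, aspect 16:9
    # 1680x1050: 1.76 megapixels, aspect 16:10
    # 1920x1080: 2.07 megapixels, aspect 16:9
    # 1920x1200: 2.30 megapixels, aspect 16:10

A 720p image has less than half the pixels of a 1080p panel, so it gets stretched and blurred to fill the screen, and stretching a 16:9 image onto a 16:10 panel distorts it or forces letterboxing on top of that. That's why 30 FPS at 720p looks so bad up close on a monitor.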
If someone wants to get into a real debate with me about graphics cards, let me know; we can break down every single feature they provide and I will show you why it makes sense for ATI and NVIDIA to keep evolving.