Let's put it this way: could the highest-end graphics card of 2006, when the PS3 came out, play The Last of Us at that level of quality? (This was when Oblivion and F.E.A.R. were good-looking games pushing PCs.)
I doubt it. There is a lot working in favor of consoles when it comes to optimization. Right now, high-end graphics cards are better and will easily outperform any currently released next-gen console game, but as console games keep being optimized, the PC graphics cards of that same era would have trouble getting the same optimization effort from developers.
In contrast, the PS3 version of Oblivion ran at 1280x720, ~25 fps, with HDR and no AA. Oblivion pushed the PS3 pretty hard, but not so much the then-high-end cards like the 8800 GTX and X1950 XTX, except that PC gaming was already moving to higher resolution standards and therefore demanded stronger video cards. This cannot be emphasized enough, because it is far too often left out of these comparisons: 1280x720 was never a standard resolution for PC gaming. 1280x1024 was common at the time, as was 1600x1200. The move to widescreen PC gaming, even back in 2006, went pretty much immediately to 1440x900, 1680x1050, 1920x1080, or 1920x1200. Gaming PCs held a resolution advantage for pretty much the entirety of the last console generation, along with the usual higher demands in framerate and AA (another thing the PS3 sparsely did in any capacity before the advent of post-processing AA solutions).

Naturally, PC gamers, as we generally do, gravitated to higher-powered graphics hardware, especially relatively revolutionary hardware. When the G80 (the 8800 cards) gained very affordable variants in 2007, with huge and immediately noticeable performance benefits (especially with Crysis, higher resolutions, DirectX 10 features, and the popularization of unified shader architectures, not to mention the introduction of CUDA on Nvidia's side), it was a no-brainer to move on to stronger, affordable new tech. Crysis, even when it was recreated for the consoles only a couple of years ago on the latest CryEngine, never ran anywhere near as well as it did on 8800 variants, and the 8800 GTX (as well as the 8800 GT) ran Battlefield 3, Crysis 2, and every other recent multiplatform title people still cared to run on the thing far better than either console did.
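To put that resolution gap in concrete numbers, here's a quick sketch of my own (the resolution list comes from the discussion above; the script itself is just an illustration) comparing the pixel counts of common PC resolutions of the era against the PS3's 1280x720:

```python
# Pixel-count comparison: common 2006-era PC resolutions vs the PS3's 1280x720.
# Illustrative only; the resolutions listed are the ones mentioned above.
ps3_pixels = 1280 * 720  # 921,600 pixels per frame

pc_resolutions = {
    "1280x1024": 1280 * 1024,
    "1600x1200": 1600 * 1200,
    "1440x900":  1440 * 900,
    "1680x1050": 1680 * 1050,
    "1920x1080": 1920 * 1080,
    "1920x1200": 1920 * 1200,
}

for name, pixels in pc_resolutions.items():
    # The ratio shows how many times more pixels a PC GPU had to fill per frame.
    print(f"{name}: {pixels / ps3_pixels:.2f}x the pixels of 720p")
```

Even the modest 1280x1024 is about 1.4x the pixels of 720p, and 1920x1200 is 2.5x, which is why a fair PS3-vs-PC comparison can't ignore the resolutions PC cards were actually being asked to drive.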
Guesswork and doubt based on pretty much nothing, let alone on an understanding of what kinds of advancements PC GPU technology was making at the time, is not a quantifiable, sound method of determining what a given GPU can or cannot do in a given environment. I guarantee you: like everything else the 8800 GTX/GT ever ran across both consoles and PCs, The Last of Us would easily have been possible on PCs with those old cards, at significantly better performance than what the PS3 put out.

On a side note, there is one crucial factor on the PS3's side that complicates any PS3-PC comparison, since we cannot benchmark it: the SPEs of its Cell processor were absolutely vital to the PS3 even keeping up with the 360 to some degree in multiplatform games. Acclimation to the PS3's bizarre and still-unique architecture, not low-level, performance-increasing optimization, is the reason the PS3 ever managed to get beyond the poor performance visible in its early titles like GTA IV and Assassin's Creed. In any case, the 8800 GTX, being several times more powerful than the GPU in the PS3, would in no way be outclassed by the PS3's performance in any game or performance measurement.