
Can a £100 PC graphics card match a next-gen console?

Let's put it this way: could the highest-end graphics card of 2006, when the PS3 came out, play The Last of Us at that level of quality? (Back when Oblivion and F.E.A.R. were good-looking games pushing PCs.)

I doubt it. There is a lot working in favor of consoles when it comes to optimization. Right now, high-end graphics cards are better and will easily outperform any currently released next-gen console game, but as console games keep being optimized, the same PC graphics cards of that era would have trouble getting the same optimization from developers.

Considering the timelines of last gen, there was a huge jump in graphics from 2004 to 2007. You mention a game that came out last year and benefits from developers' accumulated experience with a single architecture, which distorts your actual point even more.

Also, stop with the optimization talk before I spam a certain post on the subject; it's a useless talking point for a certain crowd. PC gamers, especially those who buy in bits, get products over time. Very few go an entire console generation with one card; Steam surveys clearly show shifts over time, and the same can be said for sales of AMD and Nvidia products.
 
Let's put it this way: could the highest-end graphics card of 2006, when the PS3 came out, play The Last of Us at that level of quality? (Back when Oblivion and F.E.A.R. were good-looking games pushing PCs.)

I doubt it. There is a lot working in favor of consoles when it comes to optimization. Right now, high-end graphics cards are better and will easily outperform any currently released next-gen console game, but as console games keep being optimized, the same PC graphics cards of that era would have trouble getting the same optimization from developers.

Yes, it could.
http://www.youtube.com/watch?v=jHWPGmf_A_0

And an 8800 GTS, one of the high-end GPUs back then (just below the 8800 GTX), would run The Last of Us at 1080p.
How do I know? Look at this Crysis 2 benchmark:

[Crysis 2 benchmark chart: "Gamer" settings at 1900x1200]

http://www.geforce.com/whats-new/guides/crysis-2-benchmarks#3

The GTS is a little faster than the GT.
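
(A back-of-the-envelope Python sketch of that kind of inference. The performance index and the benchmark fps below are made-up placeholders, not figures from the linked page:)

# Naive cross-GPU fps estimate: assumes fps scales linearly with raw GPU
# throughput, ignoring CPU limits, VRAM, and driver overhead.
def estimate_fps(known_fps, known_perf, target_perf):
    return known_fps * (target_perf / known_perf)

# Placeholder relative-performance indices (8800 GT = 1.00); the GTS being
# "a little faster than the GT" is the claim above, the 1.10 is invented.
GT, GTS = 1.00, 1.10
gt_fps = 30.0  # hypothetical 1080p benchmark reading, not from the link

print(f"Estimated 8800 GTS fps at 1080p: {estimate_fps(gt_fps, GT, GTS):.1f}")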
 
Let's put it this way: could the highest-end graphics card of 2006, when the PS3 came out, play The Last of Us at that level of quality? (Back when Oblivion and F.E.A.R. were good-looking games pushing PCs.)

I doubt it. There is a lot working in favor of consoles when it comes to optimization. Right now, high-end graphics cards are better and will easily outperform any currently released next-gen console game, but as console games keep being optimized, the same PC graphics cards of that era would have trouble getting the same optimization from developers.

Lol, don't forget the PS3 cost $600 at launch, so that gives some wiggle room. But yeah, TLOU at 720p, medium settings, 24-30 fps wouldn't require much nowadays. The previous gen dragged on too long anyway, so it's not much of a fair comparison.
 
Let's put it this way: could the highest-end graphics card of 2006, when the PS3 came out, play The Last of Us at that level of quality? (Back when Oblivion and F.E.A.R. were good-looking games pushing PCs.)

I doubt it. There is a lot working in favor of consoles when it comes to optimization. Right now, high-end graphics cards are better and will easily outperform any currently released next-gen console game, but as console games keep being optimized, the same PC graphics cards of that era would have trouble getting the same optimization from developers.

[Oblivion PC benchmark chart]


In contrast, the PS3 version of Oblivion was running at 1280x720, ~25 fps, with HDR and no AA. Oblivion was pushing the PS3 pretty hard, but not so much the then-high-end cards like the 8800 GTX and X1950 XTX, except insofar as PC gaming was already moving to higher resolution standards and therefore required stronger video cards.

This cannot be emphasized enough, as it is far too often left out of these comparisons: 1280x720 was never a standard resolution for PC gaming. 1280x1024 was common at the time, as was 1600x1200. The move to widescreen PC gaming, even back in 2006, was pretty much immediately to 1440x900/1680x1050/1920x1080/1920x1200. Gaming PCs commanded a resolution advantage for pretty much the entirety of the last console generation, along with the usual higher demands in framerate and AA (another thing the PS3 rarely did in any capacity before the advent of post-processing AA solutions).

Naturally, PC gamers, as we generally do, gravitated toward higher-powered graphics solutions, especially relatively revolutionary ones. When the G80 (the 8800 cards) started to get very affordable variants in 2007, with huge and immediately noticeable performance benefits (especially with Crysis, higher resolutions, DirectX 10 features, and the popularization of unified shader architectures, not to mention the introduction of CUDA on Nvidia's side), it was pretty much a no-brainer to move on to stronger, affordable new tech. Crysis, even after it was recreated for the consoles only a couple of years ago with the latest CryEngine, never ran anywhere near as well as variants of the 8800 managed the game; and the 8800 GTX (as well as the 8800 GT) ran Battlefield 3, Crysis 2, and every other recent multiplatform game people still cared to run on the thing far better than either console.
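
(To put that resolution gap in numbers, here is a quick Python sketch using the resolutions named above; the only claim it encodes is pixel-count arithmetic:)

ps3 = 1280 * 720  # the PS3's 720p render target for Oblivion

# Common PC resolutions of the era versus the console target.
for w, h in [(1280, 1024), (1440, 900), (1600, 1200),
             (1680, 1050), (1920, 1080), (1920, 1200)]:
    print(f"{w}x{h}: {w * h:>9,} px ({(w * h) / ps3:.2f}x 720p)")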

Guesswork and doubt based on pretty much nothing, let alone on an understanding of the advancements PC GPU technology was making at the time, is not a quantifiable, sound method of determining what a given GPU can or cannot do in a given environment. I guarantee you that, like everything else both consoles and PCs with an 8800 GTX/GT ever ran, The Last of Us would have easily been possible on PCs with those old cards, at significantly better performance than what the PS3 put out.

On a side note, there was also one crucial factor to the PS3 that's a bit of a confounding factor in any PS3-PC comparison, since we cannot benchmark it: the SPEs of its Cell processor were absolutely vital to the PS3 even being able to keep up with the 360 to some degree in multiplatform games. Acclimation to the PS3's bizarre and still-unique architecture, not low-level, performance-increasing optimization, is the reason the PS3 ever managed to get beyond the poor performance visible in its early titles like GTA IV and Assassin's Creed. In any case, the 8800 GTX, being several times more powerful than the GPU in the PS3, would in no way be outclassed by the PS3's performance in any game or performance measurement.
 
Of course it is; I was weighing in on the question that had just been asked: could a high-end GPU from 2006 play TLOU?
Which is why I think, if you want a PC to comfortably survive a console generation, you should get a GPU that is much stronger than what is in the consoles, e.g. an AMD 7970.
 
Which is why I think, if you want a PC to comfortably survive a console generation, you should get a GPU that is much stronger than what is in the consoles, e.g. an AMD 7970.

Sure, I agree with that. I've said a couple of times already that I think the comparison made by DF is an interesting discussion point and not much more.
 
Which is why I think, if you want a PC to comfortably survive a console generation, you should get a GPU that is much stronger than what is in the consoles, e.g. an AMD 7970.

The two generations are different. Last generation, the consoles were matching top PC performance at their introduction (roughly). This generation, they aren't even close.
 
All this talk about the 8800, man, such an awesome card.

I'd wager it's still one of the best GPUs ever.

off topic:

Congrats to Jim-jam; by the looks of it, you have a new family member.
 
Which is why I think, if you want a PC to comfortably survive a console generation, you should get a GPU that is much stronger than what is in the consoles, e.g. an AMD 7970.

Nah, it's overkill for console-like settings, but generally it's a very good choice.
Let's not forget that the 8800 GTS not only played games at console settings, but outperformed them by a factor of two or more.
 
Which is why I think, if you want a PC to comfortably survive a console generation, you should get a GPU that is much stronger than what is in the consoles, e.g. an AMD 7970.

I thought it had been fairly well established that this line of thinking is no longer as relevant as it was last generation. The consoles are essentially PCs in a box, more so now than ever before. And they're under-powered compared to previous console generations in the scheme of things. A mid-range card now is much more likely to keep up for years to come, for various reasons. The landscape has changed, drastically even.
 
People need to realize that the PC overhead from the OS and drivers is mostly put on the CPU. And the new consoles have pathetic laptop CPUs. Any decent desktop CPU will be able to handle that overhead.
 
I thought it had been fairly well established that this line of thinking is no longer as relevant as it was last generation. The consoles are essentially PCs in a box, more so now than ever before. And they're under-powered compared to previous console generations in the scheme of things. A mid-range card now is much more likely to keep up for years to come, for various reasons. The landscape has changed, drastically even.
Hardly. A mid-range card will quickly become irrelevant. My 6950 didn't even guarantee me 1080p60 for every single game 'last gen'. Now try playing COD Ghosts or AC Black Flag.

Now my PS4 is suddenly a slightly more capable gaming system than what I have.

It's still the same: get a PC GPU which is stronger than the console's. I did the same when I had a PS3 and 360 (an 8800 GTS, which I upgraded to a 6950 for 1080p60), and I will do the same when gen-2 Mantle cards get released.
 
This is a complete lie. My 6950 didn't even guarantee me 1080p60 for every single game 'last gen'. Now try playing COD Ghosts or AC Black Flag.

This is an incorrect comparison. Last-gen consoles weren't pushing 1080p (ever) or 60 fps (mostly) in any of those games. The reason you need better PC hardware is that you apparently have higher standards for the PC experience.
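
(A quick Python sketch of the scale difference implied here, using the round numbers in this exchange; 720p30 as a typical last-gen console target is an assumption for illustration:)

pc = 1920 * 1080 * 60      # the poster's 1080p60 standard
console = 1280 * 720 * 30  # a typical last-gen console target (assumed)

print(f"PC target:      {pc:,} pixels/s")
print(f"Console target: {console:,} pixels/s")
print(f"Ratio: {pc / console:.1f}x")  # ~4.5x the pixel throughput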
 
Hardly. A mid-range card will quickly become irrelevant. My 6950 didn't even guarantee me 1080p60 for every single game 'last gen'. Now try playing COD Ghosts or AC Black Flag.

Now my PS4 is suddenly a slightly more capable gaming system than what I have.

It's still the same: get a PC GPU which is stronger than the console's. I did the same when I had a PS3 and 360 (an 8800 GTS, which I upgraded to a 6950 for 1080p60), and I will do the same when gen-2 Mantle cards get released.

Why would a 6950 give you 1080p60 on every last-gen game? And what settings did you test it at?
 