
PC component performance degradation myth

Updated OP

In light of the ongoing "The GeForce 750 Ti shouldn't be capable of matching or beating the PS4 GPU" thread, I decided to perform an experiment. I still have a working 8800GT, and my HTPC uses a Sandy Bridge Celeron G530 dual core, which according to benchmarks puts it in about the same category as a Core 2 Duo.

I know the 8800GT is a higher-end part compared to the PS3 and Xbox 360 GPUs, but it's all I have, and I still think it's close enough to extrapolate some meaningful data in this context.

There are more than a few people on this board who believe in the promise of the coding-to-the-metal philosophy and the oft-quoted John Carmack tweet below:

[Image: the oft-quoted John Carmack tweet claiming that consoles get roughly twice the performance of equivalent PC hardware]


So my goal is to prove or disprove these beliefs with cold hard facts.

My test setup is as follows:

Intel Celeron G530 @ 2.4 GHz
ASUS P8H61-M
4 GB G.SKILL DDR3-1333
MSI NX8800GT OC

My first game for now is Tomb Raider. This is all at 720p, which is the same resolution as the PS360 versions of this game.

Settings presets tested:

Normal settings

High settings

Custom/Ultra


I am blown away by these tests and can't believe how well the 8800GT performs on such a modern game.

Update:

I removed my Steam list because I will be concentrating on Bioshock Infinite due to its excellent in-engine benchmark. It will allow me to use custom resolutions to match the non-standard resolutions the PS360 versions use, and I will downclock my 8800GT to try to better mimic the console hardware. I don't think this is necessary, but for the sake of science I will do it. I don't know how it's going to go, because I can't fully account for things like the eDRAM, but it's easier to match GPUs against the Xbox, because the PS3 has to utilize the Cell for some graphics tasks and I can't account for that at all.
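
As a rough, back-of-the-envelope way to think about the downclocking target, here is a quick sketch comparing theoretical shader throughput. All of the spec figures are approximate public numbers and should be treated as assumptions for illustration, not measurements:

```python
# Rough theoretical shader-throughput comparison. The figures below are
# approximate public specs, used here only as illustrative assumptions;
# real-world performance depends on far more than peak FLOPS.

def gflops(shaders, ops_per_shader_per_clock, clock_mhz):
    """Theoretical single-precision GFLOPS."""
    return shaders * ops_per_shader_per_clock * clock_mhz / 1000.0

gt8800 = gflops(112, 3, 1500)   # 8800GT: 112 SPs, ~3 ops/clock, 1500 MHz shader clock
xenos  = gflops(48, 10, 500)    # Xbox 360 Xenos: 48 unified shaders, ~10 ops/clock, 500 MHz

print(f"8800GT ~{gt8800:.0f} GFLOPS, Xenos ~{xenos:.0f} GFLOPS, ratio ~{gt8800 / xenos:.1f}x")
# Roughly a 2x gap on paper, which is in line with the "at least twice as
# powerful" figure quoted later in the thread.
```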

To try to clear things up, I'm quoting myself:
I see many are still missing the point. This is not a like-for-like comparison of hardware. It's about people's belief that somehow, magically, current PC hardware will be suffering at the end of the current console generation while the consoles will still be going strong. This isn't the case. My 8800GT is faster than the old consoles; it was then and it is now. Just like today's cards will still be outperforming the PS4 and XB1 seven years from now.

[Animated gif]


This gif is not meant to dispel Carmack's quote as false. It's aimed at the myth of PC hardware becoming obsolete in comparison to console hardware.
 

epmode

Member
It'd be nice if games included a "console" graphics preset so there wouldn't be so much guessing about the proper settings.
 

Eusis

Member
That's really not a good example, at all. The 8800GT is probably roughly like using the new GeForce GTX 970 against consoles; it's a mid-range card with a huge leap over the prior year, which was when the new consoles (well, all of them, anyway) had debuted. Well, the 8800GT was two years after the 360 as I recall, but Microsoft took a bloodbath to get more cutting-edge hardware out there, whereas the PS4/XB1 weren't as aggressive, so I think it's safe to assume the curbstomping affordable hardware comes a bit sooner this time.

ANYWAYS, the 9800GT, the 8800GT with a new paint job, was when I mostly moved to PC for multiplats and reliably got higher resolution (usually 1680x1050) at higher image quality and FPS, so seriously, it's a terrible example to use.

EDIT: And I mainly felt a need to upgrade not just because of wanting to run something like Just Cause 2 out-and-out better than consoles, but because The Witcher 2 basically went "sure, you can use this card... if you want to play at low quality and 20-30 fps!" So you have to remember that every so often there may actually be a PC-focused title that will kill the PC that otherwise stomps all over consoles.
 

Serandur

Member
Thanks for posting this. Part of the issue is that people often assume a false equivalency in game settings between a console and a PC, as if the differences in resolution, framerate, and the whole scale of potential rendering, lighting, AA, physics, effects, object detail, etc. were a trivial consideration, when in fact they are not. One of the major differences in recent years, for example, has been that 1920x1080 or at least 1680x1050 were standard PC resolutions whereas consoles were typically running 1280x720 or lower. That is one such consideration to be made. Even back when the Xbox 360 and PS3 were released, 1280x1024 was more of a PC standard and naturally more demanding itself.
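
For a sense of scale, here is the simple pixel-count arithmetic for the resolutions mentioned above (a quick sketch, not a benchmark):

```python
# Pixel counts for the resolutions discussed above, relative to 720p.
resolutions = {
    "1280x720 (typical console)":    (1280, 720),
    "1280x1024 (older PC standard)": (1280, 1024),
    "1680x1050":                     (1680, 1050),
    "1920x1080":                     (1920, 1080),
}
baseline = 1280 * 720
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name:32s} {pixels:>9,d} px  ({pixels / baseline:.2f}x 720p)")
# 1080p pushes 2.25x as many pixels as 720p before any other settings
# differences are even considered.
```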

The truth of the matter is that many performance improvements we saw during the last console generation didn't come from some console-exclusive exploitation of efficiency that PCs lack, but from an across-the-board advancement in shader and engine utilization driven by emerging technologies (like the unified shader architectures in the 360 and in the 8000-series GPUs onward, and multi-core CPUs) and adaptation to newer development styles, and this benefited all platforms. Another aspect is, of course, the esoteric nature of console hardware (particularly the PS3) last go-around and the improvements made in using it efficiently. The difference this time around is that there are no such exotic hardware specifications, and any engine improvements that take advantage of the updated hardware feature sets will also benefit PCs, which have long had a variety of multi-core CPUs, memory, API features, and instruction sets that have gone largely underutilized. DirectX 11 as it stands is already quite efficient with regard to GPU usage, and, as illustrated by Mantle, DX12 will soon mitigate CPU draw-call overhead as well.
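
A crude way to see why draw-call overhead is a CPU-side problem rather than a GPU one (the per-call costs below are invented purely for illustration):

```python
# Toy model of CPU-side frame cost from draw-call submission overhead.
# The per-call costs and call count are made-up illustrative numbers.

def cpu_frame_ms(draw_calls, overhead_us_per_call):
    """CPU time per frame spent just submitting draw calls, in ms."""
    return draw_calls * overhead_us_per_call / 1000.0

draw_calls = 3000          # assumed scene complexity
thick_api_us = 10.0        # assumed high per-call cost (DX11-era style)
thin_api_us = 1.0          # assumed low per-call cost (Mantle/DX12 style)

print(f"Thick API: {cpu_frame_ms(draw_calls, thick_api_us):.0f} ms/frame on the CPU")
print(f"Thin API:  {cpu_frame_ms(draw_calls, thin_api_us):.0f} ms/frame on the CPU")
# 30 ms vs 3 ms: the GPU workload is unchanged, but the CPU stops being
# the bottleneck -- which is the whole pitch behind Mantle and DX12.
```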

I'm not surprised at all by how the 8800 holds up; there are many examples littering the internet of it holding its own (with reasonable expectations relative to consoles) in modern games at settings and resolutions demonstrably above what the consoles can achieve (see the Battlefield, Crysis 2, etc. videos). The crucial distinction is that ever-greater rendering settings often push PC gamers to upgrade, not the necessity of matching what are, factually, the very low standards consoles have to abide by. With the right settings, any piece of hardware is easy to push to its limits, but the inverse is true as well: old hardware can hold up if you're only expecting what other old hardware (the consoles) manages as a baseline, but PCs don't necessarily adhere to those baselines. In any case, the 8800GT was stronger then and it's still stronger now.
 

Abounder

Banned
There are more than a few people on this board who believe in the promise of the coding-to-the-metal philosophy and the oft-quoted John Carmack tweet.

I am blown away by these tests and can't believe how well the 8800GT performs on such a modern game.

Focusing on a single spec or even a family of GPUs does matter somewhat; for example, check out Ryse running better on AMD cards, and Mantle support.

But yeah, the last console gen lasted a long time, so it should be no surprise that PC hardware gets a lot of bang for the buck.
 

Serandur

Member
That's really not a good example, at all. The 8800GT is probably roughly like using the new GeForce GTX 970 against consoles; it's a mid-range card with a huge leap over the prior year, which was when the new consoles (well, all of them, anyway) had debuted. Well, the 8800GT was two years after the 360 as I recall, but Microsoft took a bloodbath to get more cutting-edge hardware out there, whereas the PS4/XB1 weren't as aggressive, so I think it's safe to assume the curbstomping affordable hardware comes a bit sooner this time.

ANYWAYS, the 9800GT, the 8800GT with a new paint job, was when I mostly moved to PC for multiplats and reliably got higher resolution (usually 1680x1050) at higher image quality and FPS, so seriously, it's a terrible example to use.

EDIT: And I mainly felt a need to upgrade not just because of wanting to run something like Just Cause 2 out-and-out better than consoles, but because The Witcher 2 basically went "sure, you can use this card... if you want to play at low quality and 20-30 fps!" So you have to remember that every so often there may actually be a PC-focused title that will kill the PC that otherwise stomps all over consoles.

Terrible comparison; the 8800GT is at least twice as powerful as the 360 or PS3.

And it's maintaining that at-least-twice-as-high lead. It's not a terrible comparison unless you're assuming the 8800 is being presented as a match for the PS3/360, but it's not. It's being compared and is proving itself still at least about twice as fast (see the normal-settings benchmarks the OP posted). The stronger hardware more or less retained a similarly sized advantage. The myth is that old "consoles can do 2x their PC equivalent" nonsense that the OP quoted. The 8800GT is roughly at least twice as powerful as where the 360's GPU stood, and the results are roughly at least twice as fast in Tomb Raider at 1280x720 and normal settings. It's a valid comparison, if only to prove how the 8800s also benefited from the engine advancements of the past few years. It's a rough comparison given unknown variables we can't obtain information on, but at the very least, it suggests that any console-exclusive advantages that grew over the years were quite minimal.
 
That's really not a good example, at all. The 8800GT is probably roughly like using the new GeForce GTX 970 against consoles; it's a mid-range card with a huge leap over the prior year, which was when the new consoles (well, all of them, anyway) had debuted. Well, the 8800GT was two years after the 360 as I recall, but Microsoft took a bloodbath to get more cutting-edge hardware out there, whereas the PS4/XB1 weren't as aggressive, so I think it's safe to assume the curbstomping affordable hardware comes a bit sooner this time.

ANYWAYS, the 9800GT, the 8800GT with a new paint job, was when I mostly moved to PC for multiplats and reliably got higher resolution (usually 1680x1050) at higher image quality and FPS, so seriously, it's a terrible example to use.

EDIT: And I mainly felt a need to upgrade not just because of wanting to run something like Just Cause 2 out-and-out better than consoles, but because The Witcher 2 basically went "sure, you can use this card... if you want to play at low quality and 20-30 fps!" So you have to remember that every so often there may actually be a PC-focused title that will kill the PC that otherwise stomps all over consoles.

It's not? A GPU from 2007 and a CPU equivalent to a 2006 CPU outperforming last-gen consoles by a large margin is not a good example? According to posts in the 750 Ti thread it will be obsolete in 2-4 years, yet I have provided proof that a 7-year-old CPU/GPU combo outperforms consoles that were high-end parts at the time of their conception, versus the current consoles, which are about 2 years behind technologically.
 

Kieli

Member
There are more than a few people on this board who believe in the promise of the coding-to-the-metal philosophy and the oft-quoted John Carmack tweet.

Makes me want to puke.

PC can code to the soul of the silicon.

I'd like to see "pedal to the metal" top that.

Edit: /s, in case anyone is wondering.
 

Asmodai48

Member
Terrible comparison; the 8800GT is at least twice as powerful as the 360 or PS3.

What? The whole point is to prove you don't need 2x the power compared to consoles... If the whole "you need 2x the power to equal a console's performance" claim were true, this GPU would be equivalent to the Xbox 360/PS3; instead it trashes them.
 
Indeed. My last rig (which is 6 years old) can run recent games just fine (mostly at 30 fps on medium settings). It's a mid-range setup overall. I upgraded it earlier this month, though.
PC components do degrade, just not as fast as people claim. I give it 5-7 years.
If you have a monster PC, it'll last as a monster for at least 4 years.
 

Josman

Member
I've got a 750 Ti paired with a 4670K; it plays games better than a PS4, and it's incredibly silent and power-efficient. Tomb Raider and Battlefield 4 were rock solid at 1080p/60 fps with settings higher than the consoles, and Bioshock Infinite also ran incredibly well.

The card was supposed to tide me over while I waited for whatever high-end GPU was out by the time the Oculus Rift CV came out, but it has surprised me in such a nice way. Then again, the CPUs are not even comparable.

But I'm not downplaying the PS4; it's actually impressive what they managed to squeeze into a small $400 box. The Xbone is a different, sadder case though.
 
And it's maintaining that at-least-twice-as-high lead.

Well, the PS3/360 are 2x as powerful as equivalent hardware, so using an 8800GT evens the playing field for the PC.



That's not how it works. The quote says "consoles run 2x or so better than equal PC hardware." You can't prove anything about that by using a 2x (actually more) powerful system. It's irrelevant.

Just think about it: on some PCs, those games will be completely unplayable. So the 8800GT PC will be infinitely better than a PC with a lesser GPU, even though the 8800GT itself is not infinitely better than the lesser GPU. Have you just proved that PCs are infinitely better than PCs? No, the method was flawed. For example, let's say the 8800GT is 3 times more powerful than the 7900 GS... would that mean that at equivalent settings you will always get 3 times the performance? Or could it be much more?
 

Kieli

Member
Well, the problem is that this advantage of a closed system (i.e. the consoles) is amplified by the often poorly optimized ports.

PCs have all this power, and then all we get to use it for is brute-force rendering.

So I can definitely see the same video card performing worse relative to the PS4 and Xbone as the gen progresses.
 

hipbabboom

Huh? What did I say? Did I screw up again? :(
People trying to call out others and looking ignorant in the process. You have no understanding of the "myth" you're trying to bust.
 

Eusis

Member
And it's maintaining that at-least-twice-as-high lead. It's not a terrible comparison unless you're assuming the 8800 is being presented as a match for the PS3/360, but it's not. It's being compared and is proving itself still at least about twice as fast (see the normal-settings benchmarks the OP posted). The stronger hardware more or less retained a similarly sized advantage. The myth is that old "consoles can do 2x their PC equivalent" nonsense that the OP quoted. The 8800GT is roughly at least twice as powerful as where the 360's GPU stood, and the results are roughly at least twice as fast in Tomb Raider at 1280x720 and normal settings. It's a valid comparison, if only to prove how the 8800s also benefited from the engine advancements of the past few years. It's a rough comparison given unknown variables we can't obtain information on, but at the very least, it suggests that any console-exclusive advantages that grew over the years were quite minimal.
I'd like to get some hard numbers again on the consoles' GPUs versus the 8800GT, though the 750 Ti is also, at least in practice, roughly in line with the new consoles, not way ahead, and I'd think the real intent of a thread trying to defend the 750 Ti would be to find a budget, roughly console-equal GPU from 2007, not the mid-range one people paid about $200-$300+ for.
It's not? A GPU from 2007 and a CPU equivalent to a 2006 CPU outperforming last-gen consoles by a large margin is not a good example? According to posts in the 750 Ti thread it will be obsolete in 2-4 years, yet I have provided proof that a 7-year-old CPU/GPU combo outperforms consoles that were high-end parts at the time of their conception, versus the current consoles, which are about 2 years behind technologically.
Except you're using a mid-range GPU as a stand-in for a low-end GPU. Now that I look at prices, the 760 is probably a better comparison to the 8800 GT than the 970... but there's still that underlying message that the 750 Ti should be able to keep competing in the long run, when if anything the 8800GT, which was an out-and-out console stomper, isn't THAT far ahead in a later-gen multiplat game, whereas the 750 Ti is more of a console matcher and looks like it could have serious problems keeping on matching later in the gen.
 
He posed a question. Why be so hostile?

Jeez man, no need to be so harsh on people.

You guys are right. I apologize, but Carmack's quote is so misused that it drives me crazy. No one really knows what the context of the quote is. The most popular interpretation is that he is referring to CPU utilization, which could be true, but that has nothing to do with my benchmarks.
 
This thread has a strange nastiness about it.

I never played TR but is it safe to assume it didn't look as good as (e.g.) Uncharted 3? I think Loofy's point about being able to get more out of a platform-specific game as compared to a multiplatform game is quite relevant.

You guys are right. I apologize, but Carmack's quote is so misused that it drives me crazy. No one really knows what the context of the quote is. The most popular interpretation is that he is referring to CPU utilization, which could be true, but that has nothing to do with my benchmarks.
If you care so much about providing context why don't you include the tweet he's replying to for clarity? It would be nice to know exactly what was asked.
 

Naminator

Banned
You can try Alan Wake to see if any of that 1st party vs 3rd party stuff holds any weight.

I know that Sleeping Dogs, Batman, and Bioshock Infinite have benchmarks, so maybe try those?

And since you have DS and DS2 installed already, care to tell us how those run?
 

danwarb

Member
People trying to call out others and looking ignorant in the process. You have no understanding of the "myth" you're trying to bust.

The Carmack quote is ambiguous without more context, but the literal interpretation of it, while ridiculous, is effectively busted there.
 

ISee

Member
Someone should build a GTX 750 Ti system with an i3 4130 and test all multiplat games at estimated console settings (past and future). Something like "The GTX 750 Ti / i3 4130 OT: let's finally test it".
 
The Carmack quote is ambiguous without more context, but the literal interpretation of it, while ridiculous, is effectively busted there.


The OP is not even addressing the quote. To use some totally made-up numbers:

Console: power 1, performance 2
PC: power 1, performance 1

Console: power 2, performance 8
PC: power 2, performance 4

This is just to show that, logically, the OP's test is irrelevant to the quote. It is perfectly possible for a PC with double the power to have double the performance, AND for a console with equal power to have double the performance, as long as console performance increases faster with greater power.
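
To make the same point concrete, here is a toy model that reproduces the made-up numbers above (none of this is a claim about real hardware):

```python
# Toy model matching the invented numbers above: performance grows with the
# square of raw power here (purely illustrative), and consoles get a flat 2x
# multiplier over a PC of equal power.

def perf(power, console=False):
    return (power ** 2) * (2 if console else 1)

for p in (1, 2):
    print(f"power {p}: console perf {perf(p, console=True)}, PC perf {perf(p)}")
# power 1: console perf 2, PC perf 1
# power 2: console perf 8, PC perf 4
#
# The OP's test is effectively "PC at power 2 vs console at power 1":
# 4 vs 2, so the PC wins by 2x. Yet within this same model the quote
# ("a console beats a PC of EQUAL power by 2x") still holds, which is why
# beating a weaker console doesn't, by itself, settle anything about it.
```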
 

belmonkey

Member
I'd love to see how something like an 8600 GT or 9500 GT (the GDDR3 versions, not GDDR2) holds up; I think I read that they're similar in performance to the 7800, although with DX10 support (and not completely antiquated like the 7800). Speaking of the 7800, is it really the right match for those consoles? The PS3 had the Cell doing extra graphics work, while the 360 had an ATI GPU with double the shaders.
 
This thread has a strange nastiness about it.

I never played TR but is it safe to assume it didn't look as good as (e.g.) Uncharted 3? I think Loofy's point about being able to get more out of a platform-specific game as compared to a multiplatform game is quite relevant.


If you care so much about providing context why don't you include the tweet he's replying to for clarity? It would be nice to know exactly what was asked.

I have a PS3, have played UC1, UC2 and UC3. I think Tomb Raider looks better than all of them. I also have TLOU and love it, but I think Tomb Raider looks better.
 

belmonkey

Member
Someone should build a GTX 750 Ti system with an i3 4130 and test all multiplat games at estimated console settings (past and future). Something like "The GTX 750 Ti / i3 4130 OT: let's finally test it".

Since I'm not too picky beyond playing at native res, I'm toying with the idea of getting that kind of system and also using it to do comparison benchmarks. There was one guy with a 750 Ti system taking requests earlier, though.
 
That's really not a good example, at all. The 8800GT is probably roughly like using the new GeForce GTX 970 against consoles; it's a mid-range card with a huge leap over the prior year, which was when the new consoles (well, all of them, anyway) had debuted. Well, the 8800GT was two years after the 360 as I recall, but Microsoft took a bloodbath to get more cutting-edge hardware out there, whereas the PS4/XB1 weren't as aggressive, so I think it's safe to assume the curbstomping affordable hardware comes a bit sooner this time.

ANYWAYS, the 9800GT, the 8800GT with a new paint job, was when I mostly moved to PC for multiplats and reliably got higher resolution (usually 1680x1050) at higher image quality and FPS, so seriously, it's a terrible example to use.

EDIT: And I mainly felt a need to upgrade not just because of wanting to run something like Just Cause 2 out-and-out better than consoles, but because The Witcher 2 basically went "sure, you can use this card... if you want to play at low quality and 20-30 fps!" So you have to remember that every so often there may actually be a PC-focused title that will kill the PC that otherwise stomps all over consoles.

I'd have to agree: http://www.techpowerup.com/gpudb/201/geforce-8800-gt.html

That card stomps on the Xbox 360 and PS3.
 
You can try Alan Wake to see if any of that 1st party vs 3rd party stuff holds any weight.

I know that Sleeping Dogs, Batman, and Bioshock Infinite have benchmarks, so maybe try those?

And since you have DS and DS2 installed already, care to tell us how those run?

Will do. These games aren't installed on my 8800GT machine yet, though, just on my main gaming PC.
 

Denton

Member
The context of that tweet was someone asking Carmack:

Can you please give us a tweet of truth about this AMD "let's drop directx" babble?

And Carmack responded with that tweet.

Which, as both the OP's results and various Digital Foundry tests show, is wildly inaccurate, for multiplatform games at least.
 

Dead Man

Member
You have a contribution or are you just going to selectively quote me?

'Oh no, somebody quoted me but not every post I have ever made.' Pfft. You waded in with the angst mate, deal with people calling you on it. When your reply to another poster is 'Enlighten us' you have fuck all ground to stand on to call for other people to contribute.

I don't have a great technical understanding so I am reading the thread (which is interesting and you have a good OP), not parading around aggression.

Have fun.
 
'Oh no, somebody quoted me but not every post I have ever made.' Pfft. You waded in with the angst mate, deal with people calling you on it. When your reply to another poster is 'Enlighten us' you have fuck all ground to stand on to call for other people to contribute.

I don't have a great technical understanding so I am reading the thread, not parading around aggression.

Have fun.

I'm the OP, and I provided three benchmarks of a previous-gen game on 2007-equivalent PC hardware that has withstood the test of time, something others have said could not happen with the current gen. Your sole contribution to this thread is quoting me to try to paint me as an aggressor. Please address the topic at hand or leave.
 