
PC component performance degradation myth

Eusis

Member
I have a PS3, have played UC1, UC2 and UC3. I think Tomb Raider looks better than all of them. I also have TLOU and love it, but I think Tomb Raider looks better.
Incidentally, you'd probably have established your point if instead you were countering people saying that, say, a specialized PS4 game would look better than anything the 760 or 970 could put out. Which I'm reasonably sure isn't going to happen even when working with DirectX rather than something closer to the metal, not that you couldn't be impressed by what the PS4/XB1 put out later on in the generation. Kind of like how you could move on to the SNES or PS2 and still be impressed by something that comes to the NES or PS1!
 

Dead Man

Member
I'm the OP, and I provided 3 benchmarks of a previous-gen game on 2007-equivalent PC hardware that has withstood the test of time, which others have said could not happen with the current gen. Your sole contribution to this thread is quoting me to try and paint me as an aggressor.

I know that, hence my edit you seem to have missed. But tell me some more about my sole contribution to this thread as if I can't comment on commentary. :/
 

Denton

Member
'Oh no, somebody quoted me but not every post I have ever made.' Pfft. You waded in with the angst mate, deal with people calling you on it. When your reply to another poster is 'Enlighten us' you have fuck all ground to stand on to call for other people to contribute.

I don't have a great technical understanding so I am reading the thread (which is interesting and you have a good OP), not parading around aggression.

Have fun.
The "Enlighten us" post was appropriate, considering that Hippababoom just insulted the OP without providing any argument or substance. Hell, he was too nice to him, if anything.
 

Serandur

Member
That's not how it works. The quote says "consoles run 2x or so better than equal PC hardware." You can't prove anything about that by using a 2x (actually more) powerful system. It's irrelevant.

Just think about it: on some PCs, those games will be completely unplayable. So the 8800GT PC will be infinitely better than a PC with a lesser GPU, even though the 8800GT itself is not infinitely better than the lesser GPU. Have you just proved that PCs are infinitely better than PCs? No, the method was flawed. For example, let's say the 8800GT is 3 times more powerful than the 7900GS... would that mean that at equivalent settings you will always get 3 times the performance? Or could it be much more?
What? Processors function via math. Relative to each other, they're comparable in terms of percentages or multiples, generalizing from the average tendencies of the hardware in a given software application. True, there are multiple parts of a processor that may be more or less stressed by rendering certain things, but absent an external limitation they still tend to perform roughly the same relative to one another in similar programs. 2x faster means 2x faster, and if it doesn't in Carmack's quote, then it is the quote that is irrelevant or badly taken out of context, not the math; the 2x figure is completely unsupported.

Processors function by math, not the wishful conjecture of people wanting to believe in magic. If the Carmack quote were taken literally, he would be saying that at least half the cycles of every processor (CPU and GPU) in a PC, half of all memory capacity, and half of all throughput are completely wasted, which is laughably generalized BS that cannot possibly be true. No results confirm anything close to that, and no engineer would overlook whatever caused such a ridiculous flaw. And if the quote is not taken literally, and is instead some rough estimate of how well developers can selectively limit and hide things in a game designed purely for one platform to maximize perceived value, then it is not mathematical, factual, or quantifiable, and is therefore irrelevant.
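To spell out what that literal reading would require (just restating the argument above in symbols, not anything Carmack actually said): if identical hardware delivers performance $P$ in a console but only $P/2$ in a PC, then by definition the PC is discarding half its capability:

$$\text{PC utilization} = \frac{P/2}{P} = 50\%$$

across every CPU, GPU, memory pool, and bus, in every game, which is exactly the claim being called BS above.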

I'd like to get some hard numbers again on the GPUs of the consoles versus the 8800GT, though the 750 Ti is also, at least in practice, roughly in line with the new consoles, not way ahead. And I'd think the real intent of a thread trying to defend the 750 Ti would be to find a budget, roughly console-equal GPU from 2007, not the mid-range one people paid about $200-$300+ for.

Except you're using a midrange GPU as a stand-in for a low-range GPU. Now that I look at prices, the 760 is probably a better comparison to the 8800 GT than the 970... but there's still that underlying message that the 750 Ti should be able to keep competing in the long run, when if anything the 8800GT, an out-and-out console stomper, isn't THAT far ahead in a later-gen multiplat game, whereas the 750 Ti is more of a console matcher and looks like it could have serious problems continuing to match later in the gen.

The 360's GPU was a custom design (the first ever with unified shaders, not an insignificant detail) somewhere in between an X1800 and an X1900 if I recall correctly, with its own unique improvements. The PS3's was a cut-down 7800 chip that struggled immensely relative to the 360's GPU and whose performance is ultimately obfuscated by the impact of the PS3's Cell processor. It's not quite so easy to compare directly given the uniqueness of both setups at the time relative to PCs, but the 8600GT is more of a match if you're looking for something on the same level. Tomb Raider no doubt runs at 30 FPS or less on both platforms with minimalistic settings, FXAA at best, and probably little to no AF. In any case, the OP's intent regarding the 750 Ti might be off-base, but the fact of the 8800's sustained potency after all these years is what I care about as a demonstration of the OP's title, not equating it to a 750 Ti.
 
So I'm currently downloading Sleeping Dogs to the 8800GT machine and that's a 45 min process. I will post results when it's done.

I just want to reiterate that I am aware of the fact that the 8800GT is a more advanced part than what is in the PS360, but the fact that it performs as well as it does dispels the "console optimization" myth.


Dunno, but my GTX 770 (2GB) is already not sufficient for the newest games.

It certainly outperforms PS4/XB1. My 7870 does by a good margin.
 


Denton

Member
Dunno, but my GTX 770 (2GB) is already not sufficient for the newest games.
It is at console-equivalent settings, resolution, and framerate and higher, as long as you stay within the VRAM limit.
One of the reasons why I went with a 3GB AMD GPU last year. And it was a good decision, seeing how much VRAM Watch Dogs, Lords, or Unity eats up.
 

Knurek

Member
So I'm currently downloading Sleeping Dogs to the 8800GT machine and that's a 45 min process. I will post results when it's done.

You are aware you're gonna get slammed by the console warriors for using a single publisher's output?
Of course afterwards you're gonna get slammed for not testing Uncharted or Infamous or Forza.
Keep up the good fight.
 
It is at console-equivalent settings, resolution, and framerate, as long as you stay within the VRAM limit.
One of the reasons why I went with a 3GB AMD GPU last year. And it was a good decision, seeing how much VRAM Watch Dogs, Lords, or Unity eats up.

I regret getting a 7870XT 2GB vs a 7950 3GB for $50 more at the time, especially since a few months after I bought my card the 7950 dropped below $250 for a bit.

You are aware you're gonna get slammed by the console warriors for using a single publisher's output?
Of course afterwards you're gonna get slammed for not testing Uncharted or Infamous or Forza.
Keep up the good fight.
I will do Alan Wake eventually, but that game seems like cheating. I don't think it stresses hardware at all. 30 mins left on Sleeping Dogs.
 
What? Processors function via math. Relative to each other, they're comparable in terms of percentages or multiples, generalizing from the average tendencies of the hardware in a given software application. True, there are multiple parts of a processor that may be more or less stressed by rendering certain things, but absent an external limitation they still tend to perform roughly the same relative to one another in similar programs. 2x faster means 2x faster, and if it doesn't in Carmack's quote, then it is the quote that is irrelevant or badly taken out of context, not the math; the 2x figure is completely unsupported.


Everything you just said is irrelevant to the OP's test. You are speaking of comparable parts. OP is not.


I just want to reiterate that I am aware of the fact that the 8800GT is a more advanced part than what is in the PS360, but the fact that it performs as well as it does dispels the "console optimization" myth.


The only thing that bothers me in this thread is the faulty logic. It's like someone says "monkeys like bananas", and you "bust" that myth by showing that monkeys have tails. Maybe monkeys don't like bananas, but you sure didn't prove that by showing a picture of a tail.

Console power 1, performance 2
PC power 1, performance 1

Console power 2, performance 8
PC power 2, performance 4

These are made-up numbers, but they show that you are using faulty logic.

These numbers are consistent with Carmack's claim and with your test. Since your test agrees with Carmack's claim, it cannot "bust" it (even if the claim is indeed false).
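To make the consistency point concrete, here is a quick sketch (in Python, using the made-up numbers from above; nothing here is a real benchmark):

```python
# Made-up numbers from the post above: a hypothetical world in which
# consoles really do get 2x the performance of equal-power PC hardware.
console = {1: 2, 2: 8}  # power level -> performance
pc      = {1: 1, 2: 4}  # power level -> performance

# The Carmack-style claim holds at every equal power level:
assert all(console[p] == 2 * pc[p] for p in (1, 2))

# And yet the OP-style observation ("my stronger PC beats the weaker
# console") also holds:
assert pc[2] > console[1]  # 4 > 2
```

Both statements are true at once in this toy world, which is why the stronger-PC result by itself can't falsify the 2x claim.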
 
Everything you just said is irrelevant to the OP's test. You are speaking of comparable parts. OP is not.





The only thing that bothers me in this thread is the faulty logic. It's like someone says "monkeys like bananas", and you "bust" that myth by showing that monkeys have tails. Maybe monkeys don't like bananas, but you sure didn't prove that by showing a picture of a tail.

Console power 1, performance 2
PC power 1, performance 1

Console power 2, performance 8
PC power 2, performance 4

These are made-up numbers, but they show that you are using faulty logic.

These numbers are consistent with Carmack's claim and with your test. Since your test agrees with Carmack's claim, it cannot "bust" it (even if the claim is indeed false).

Please post "real" numbers
 
Good intentions OP, couple of things to note.

1. The 8800GT's memory bandwidth is almost 3 times that of RSX and Xenos.
2. The CPU you used is almost twice as fast as the CPUs used in test setups of that time.
3. The RAM you used is twice the size of what was used in test setups of that time.

Flawed comparison but props for effort.

I'm not comparing my setup to equivalent PCs of the time, I'm trying to use the most equivalent hardware at my disposal to PS360. My setup outperforms PS360, when according to some PS360 should be ahead. My point isn't to say PC equivalent hardware is better than console equivalent hardware. It's to show that 7+ year old PC hardware still performs at or above console hardware.
 
Someone would need hardware from the release time frame. A high-end late 2005 PC, for instance, not an 8800GT.

PS3 was released late 2006.

My CPU is comparable to, although less powerful than, the Q6600, released Jan 2007.

I already admitted my 8800GT was beyond the PS360 GPUs, but still it's a 2007 GPU that outperforms PS360, which others have said shouldn't happen.
Current PC technology is 2-3 yrs ahead of the PS4/XB1 and yet some want to believe PC hardware won't hold up due to "optimizations" which is bullshit.

Yes, but I hit the VRAM limit in many games, even at 1080p (3GB for Ryse, 4 for Unity, 6(!) for Mordor)

3GB is not required as benchmarks show.

[benchmark chart: Ryse, 1080p, high settings]
 

dark10x

Digital Foundry pixel pusher
PS3 was released late 2006.

My CPU is comparable to, although less powerful than, the Q6600, released Jan 2007.

I already admitted my 8800GT was beyond the PS360 GPUs, but still it's a 2007 GPU that outperforms PS360, which others have said shouldn't happen.
Current PC technology is 2-3 yrs ahead of the PS4/XB1 and yet some want to believe PC hardware won't hold up due to "optimizations" which is bullshit.

3GB is not required as benchmarks show.
Xbox 360 was released in 2005.

Again, your idea is sound, but you're ruining it by using hardware that doesn't actually provide the information you're trying to present. An 8800GT is WAY beyond those consoles. Even though you've made that clear, that doesn't mean your results are accurate.

3GB is not required as benchmarks show.
It's not, but Ryse will not allow users to use the highest-detail texture setting without a 3GB card.
 

coastel

Member
I'm not comparing my setup to equivalent PCs of the time, I'm trying to use the most equivalent hardware at my disposal to PS360. My setup outperforms PS360, when according to some PS360 should be ahead. My point isn't to say PC equivalent hardware is better than console equivalent hardware. It's to show that 7+ year old PC hardware still performs at or above console hardware.


Wow, so basically hardware that is in some ways 3 times better than the PS360 will perform close to it or better. Got ya.
If people are saying hardware like the 750 Ti won't keep up in 2 or 3 years, I would think they mean the consoles will be better optimised by then; the 750 Ti won't lose any performance, of course, as that's dumb. Why would it?
 
Xbox 360 was released in 2005.

Again, your idea is sound, but you're ruining it by using hardware that doesn't actually provide the information you're trying to present. An 8800GT is WAY beyond those consoles. Even though you've made that clear, that doesn't mean your results are accurate.


It's not, but Ryse will not allow users to use the highest-detail texture setting without a 3GB card.

I mean, all you have to do is logically step down and say an X1950 XT, X1950, or 6800 Ultra would just perform worse.

The 8800 is running above console settings at higher fps... I see no reason why the lower-end GPUs could not run at nearer-console settings at similar fps.
Wow, so basically hardware that is in some ways 3 times better than the PS360 will perform close to it or better. Got ya.

It runs much better than the consoles, because it is better. Now extrapolate that performance to lower end hardware (which scales linearly). And wow!
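As a rough sketch of that extrapolation (illustrative numbers only: the 54 FPS minimum is from the OP's Tomb Raider run earlier in the thread, and the 40% power figure is a hypothetical weaker card, not a measured one):

```python
# Naive linear extrapolation: assume GPU-bound performance scales with
# raw GPU throughput. Purely illustrative.
observed_min_fps = 54.0   # OP's 8800 GT minimum in Tomb Raider
power_ratio = 0.40        # hypothetical card at 40% of an 8800 GT

estimated_min_fps = observed_min_fps * power_ratio
print(f"Estimated minimum: {estimated_min_fps:.0f} FPS")  # ~22 FPS
# A part well below the 8800 GT lands near console-level framerates,
# with no 2x console magic needed to explain anything.
```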
 
Xbox 360 was released in 2005.

Again, your idea is sound, but you're ruining it by using hardware that doesn't actually provide the information you're trying to present. An 8800GT is WAY beyond those consoles. Even though you've made that clear, that doesn't mean your results are accurate.


It's not, but Ryse will not allow users to use the highest-detail texture setting without a 3GB card.

Release date doesn't mean anything. Equivalent power does.

CPU is comparable.

According to Wikipedia the Xenos has a higher GFLOPS figure than the 8800GT and, due to the eDRAM, higher bandwidth.

Look you know and I know that my system is more powerful, but not powerful enough to overcome a 2X advantage of "theoretical" console optimization. That is all I'm saying.
 

coastel

Member
I mean, all you have to do is logically step down and say an X1950 XT, X1950, or 6800 Ultra would just perform worse.

The 8800 is running above console settings at higher fps... I see no reason why the lower-end GPUs could not run at nearer-console settings at similar fps.


It runs much better than the consoles, because it is better. Now extrapolate that performance to lower end hardware (which scales linearly). And wow!

So we need to see lower-end hardware from that era that is on par with or close to the PS360. By the way, I have always said the 2x performance is bullshit.
 

Mohasus

Member
I still have an 8600GT, but I don't have an old system to test. So people will complain saying that it is unfair because I have way more RAM than I would have had in 2006 or something. Seeing how most people are missing OP's point, I don't think I'll bother with it.

OP is just trying to show that a system that was better than a console in 2007 was better than a last-gen console during the whole gen.
 

wachie

Member
I'm not comparing my setup to equivalent PCs of the time, I'm trying to use the most equivalent hardware at my disposal to PS360. My setup outperforms PS360, when according to some PS360 should be ahead. My point isn't to say PC equivalent hardware is better than console equivalent hardware. It's to show that 7+ year old PC hardware still performs at or above console hardware.
Makes no sense, like dark10x points out as well.
 
I still have an 8600GT, but I don't have an old system to test. So people will complain saying that it is unfair because I have way more RAM than I would have had in 2006 or something.

Seeing how most people are missing OP's point, I don't think I'll bother with it.

Turn off those "extra" CPU cores in the BIOS, pop in the 8600GT, downclock to like 0.9 GHz.

Don't worry, your computer won't explode.
---------------------------------------------------------------

Actually, what CPU do you have? I think this lends to an interesting way to do the experiment.
 

Serandur

Member
Everything you just said is irrelevant to the OP's test. You are speaking of comparable parts. OP is not.





The only thing that bothers me in this thread is the faulty logic. It's like someone says "monkeys like bananas", and you "bust" that myth by showing that monkeys have tails. Maybe monkeys don't like bananas, but you sure didn't prove that by showing a picture of a tail.

Console power 1, performance 2
PC power 1, performance 1

Console power 2, performance 8
PC power 2, performance 4

These are made-up numbers, but they show that you are using faulty logic.

These numbers are consistent with Carmack's claim and with your test. Since your test agrees with Carmack's claim, it cannot "bust" it (even if the claim is indeed false).
Your logic is making no mathematical sense. No, what I said is relevant to processors in general. 2x is BS and the OP's test shows it. The 8800GT is about 2x the 360's closest known PC GPU equivalent, and at the reasonably comparable settings in the OP's test, the 8800GT isn't dipping even once below 54 FPS, about double the minimum of how the console versions run, if not a bit more. If a console really could do "2x more" with a given processor, then a PC processor acknowledged as twice as fast would merely match it, not beat it.

If you state you can do 2x more in any given context, you're directly implying that half of the processor's capabilities are wasted in another (which, as I've said, is a load of insulting and laughably generalized BS). Assume the 360's GPU is represented by 100% and the 8800GT by, say, 250%, to be generous in demonstrating the extent of the point. If the 360's GPU, by virtue of being in a console, can do 2x what the same GPU could do in a PC, it would then be represented by 200% vs the 8800's 250%; or, more accurately according to the claim's implications, the 8800 would be reduced to 125% vs the 360's 100%, which could only yield about 25% more FPS than whatever the 360 GPU is doing (about 25 FPS is a safe minimum-FPS bet for the consoles here, with 125% of that being a mere ~31 FPS as opposed to the 54 the OP demonstrated). It's basic math.
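Written out with the post's own assumed figures (100% vs 250%, and a ~25 FPS console minimum):

$$\frac{250\%}{2 \times 100\%} = 1.25, \qquad 1.25 \times 25\ \text{FPS} \approx 31\ \text{FPS} \ll 54\ \text{FPS observed}$$

The observed minimum tracks the raw hardware gap, not the halved gap the 2x claim would predict.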
 

bj00rn_

Banned
Makes no sense, like dark10x points out as well.

With extrapolation, his GPU seems to be within the parameters of what he's trying to prove/disprove. Whether he's right or not I'm not going to judge; my point is only that it's not exactly hard to understand his theory if you just try a little harder...
 

King_Moc

Banned
Probably worth bearing in mind that we don't know what frame rate the 360 or PS3 were actually capable of achieving with Tomb Raider. It was locked to 30.
 
According to Wikipedia the Xenos has a higher GFLOPS figure than the 8800GT and, due to the eDRAM, higher bandwidth.

Xenos: 240 GFLOP/s, 32 GB/s to EDRAM
8800 GT: 504 GFLOP/s, 57.6 GB/s (while also having a more modern feature set and probably being more efficient)
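For what it's worth, the raw ratios from those numbers work out to:

$$\frac{504}{240} = 2.1\times \text{ (shader throughput)}, \qquad \frac{57.6}{32} = 1.8\times \text{ (bandwidth)}$$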


I think one shouldn't take that famous Carmack quote too literally; it's just a single line without context, really. If you asked him to explain and he had enough time to do so, you would get a much more sophisticated answer.
2x may be the difference that can happen for some parts of game performance (probably mostly on the CPU side), and only if the game is thoroughly optimized for one console architecture. That's probably not the case for many multi-platform games.
Still, selected Mantle benchmarks show for example that the API alone can boost the performance by 2x in CPU-limited, draw-call heavy scenarios.
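A toy model of that CPU-limited, draw-call-heavy case (all numbers invented purely for illustration; this shows the shape of the argument, not a benchmark):

```python
# CPU frame cost = fixed per-frame game work + per-draw-call submission cost.
def frame_ms(n_draws: int, per_draw_us: float, other_ms: float = 5.0) -> float:
    return other_ms + n_draws * per_draw_us / 1000.0

draws = 10_000                               # a draw-call-heavy scene
fat_api = frame_ms(draws, per_draw_us=3.0)   # high-overhead submission path
thin_api = frame_ms(draws, per_draw_us=1.2)  # low-overhead path (Mantle-like)

print(f"{1000 / fat_api:.0f} fps -> {1000 / thin_api:.0f} fps")  # ~29 -> ~59
# CPU-limited framerate roughly doubles without the GPU getting any faster,
# which is how a "2x" figure can be true in one scenario yet meaningless
# as a blanket hardware claim.
```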
 

Sinfamy

Member
What I would also like to add is that current drivers are a lot less optimized for old GPUs, and sometimes you can benefit from using older drivers.
 

Riky

$MSFT
The last-gen consoles' major bottleneck was RAM in the end. I've seen many of these PC comparisons over the years, and the fact that they use several times the system RAM, and a chunk of VRAM on top, is often glossed over. How would a PC with only 512MB of total RAM cope?
 

spekkeh

Banned
The last-gen consoles' major bottleneck was RAM in the end. I've seen many of these PC comparisons over the years, and the fact that they use several times the system RAM, and a chunk of VRAM on top, is often glossed over. How would a PC with only 512MB of total RAM cope?

Not even sure you could run Vista with 256MB system RAM.
 

wachie

Member
With extrapolation, his GPU seems to be within the parameters of what he's trying to prove/disprove. Whether he's right or not I'm not going to judge; my point is only that it's not exactly hard to understand his theory if you just try a little harder...
He's using 3x one metric of the previous gen, 2x another, and possibly baselining the rest.

You don't see the flaw?
 

virtualS

Member
OP, I think a better comparison would be to run benchmarks on a PC that is as close as possible in terms of raw horsepower and TDP to the PS3/360 with all in game settings identical to those found on console and then compare. This is no easy thing to achieve, probably impossible considering the custom nature of console processors last gen. I still have a 7800 GS and an Athlon 64x2... all from 2006, a 150W to 200W system when gaming and perhaps equivalent to a PS3.

You'll never see that system produce anything like The Last of Us. Who knows, maybe it could in the right hands if games were written for it specifically and Windows was not needed.
 

cpooi

Member
I honestly think that in terms of fps there isn't much of a difference between PC and console with similar GPU power, if the game in question is GPU-bound, and assuming good coding. Games like Resident Evil 5 scale very nicely with GPU power.

On the other hand, if you include frame times, I think that is where PC games have an issue, due to drivers and such. (This has improved over the years, though.) A benchmark's min fps doesn't use the longest frame time to calculate it, so the long-frame-time issue is hidden.

Frame time does affect gameplay and perceptual smoothness significantly though.
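A quick sketch of how a long frame can hide inside a "min fps" figure (the 100 ms spike is invented for illustration; this assumes the common approach of computing min fps over one-second windows):

```python
# One second of play at ~60 fps with a single 100 ms hitch in the middle.
frame_times_ms = [16.7] * 54 + [100.0]

# Typical "min fps": frames rendered in the window / window length.
window_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)
# What the player actually felt at the worst moment:
worst_frame_fps = 1000.0 / max(frame_times_ms)

print(f"reported min fps: {window_fps:.0f}, worst frame: {worst_frame_fps:.0f} fps")
# ~55 fps on paper, but that one 100 ms frame is a clearly visible hitch.
```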
 

DonMigs85

Member
Anybody still have a GeForce 7800 GT lying around? Lol
I'd also like to request a run of the T-rex test on Gfxbench using older flagship cards.
 

bj00rn_

Banned
He's using 3x one metric of the previous gen, 2x another, and possibly baselining the rest.

You don't see the flaw?

A flaw is different from something nonsensical. At least it should mean there is something to discuss without having to assassinate the main theory. For example, we all know the 2x quote from Carmack is a red herring in the form and context it's usually presented. The OP's theory makes certain sense in that respect (and in terms of longevity) regardless of flaws.
 
OP, I think a better comparison would be to run benchmarks on a PC that is as close as possible in terms of raw horsepower and TDP to the PS3/360 with all in game settings identical to those found on console and then compare. This is no easy thing to achieve, probably impossible considering the custom nature of console processors last gen. I still have a 7800 GS and an Athlon 64x2... all from 2006, a 150W to 200W system when gaming and perhaps equivalent to a PS3.

You'll never see that system produce anything like The Last of Us. Who knows, maybe it could in the right hands if games were written for it specifically and Windows was not needed.

The point of this thread wasn't really to say "This PC hardware that is identical in performance to console hardware is better". It was to say that PC hardware from 7 years ago is still viable. There will never be a 1:1 comparison of PC hardware to console hardware, but PC hardware doesn't suddenly become less viable than equivalent console hardware. If you look at my Tomb Raider benchmarks I'm easily scoring twice the performance of the PS360. Even if the 8800GT is double the performance, this only proves my point.
 
The point of this thread wasn't really to say "This PC hardware that is identical in performance to console hardware is better". It was to say that PC hardware from 7 years ago is still viable. There will never be a 1:1 comparison of PC hardware to console hardware, but PC hardware doesn't suddenly become less viable than equivalent console hardware. If you look at my Tomb Raider benchmarks I'm easily scoring twice the performance of the PS360. Even if the 8800GT is double the performance, this only proves my point.
You should put this in the OP. A lot of people in here don't seem to understand.

It's certainly a common belief that console hardware lasts longer than PC hardware when it comes to games. I don't know if that's true or not but I think this is a very interesting test.
 
The point of this thread wasn't really to say "This PC hardware that is identical in performance to console hardware is better". It was to say that PC hardware from 7 years ago is still viable.

Then why did you bring up the Carmack quote? It's only related to the former, not the latter.
 

wachie

Member
A flaw is different from something nonsensical. At least it should mean there is something to discuss without having to assassinate the main theory. We all know the 2x quote from Carmack is a red herring in the form and context it's usually presented. The OP's theory makes certain sense in that respect regardless of flaws.
I didn't say his comparison was nonsense, it's just flawed. This comparison provides as good a datapoint as any other old PC whose performance metrics are all over the place when baselined against the previous-gen consoles.
 