
PC component performance degradation myth

Very interesting benchmarks OP, thanks for putting in the effort! It is pretty amazing how a midrange card that launched for $200-250 can still kick console butt even today, seven years after its release. It really puts the performance degradation myth to rest.
 

IcyEyes

Member
I don't even know where to start, but well, the myth is not busted, sorry.

Don't get me wrong, it's a nice try and worth some attention!
 

pestul

Member
God how I wish Uncharted 4 would see a good PC port to finally end this discussion. Of course it would be ignored because it would just be another multiplat.
 
So, I'm a noob when it comes to stuff like this. However, would somebody explain a few things to me? Console games task the GPU to do the heavy lifting, right? If so, are PC games/ports similarly designed or is it fundamentally different - aka does the PC version task the CPU to do more? Also, wouldn't having roughly 3.5 GB more RAM help the PC?

I'm just trying to put the comparison into context as somebody who only has cursory understanding.
 

Kinthalis

Banned
So, I'm a noob when it comes to stuff like this. However, would somebody explain a few things to me? Console games task the GPU to do the heavy lifting, right?

This is a very simplistic way of putting things. The most demanding task of modern game engines is usually that involved in 3D rasterization. That's, for the most part, the sole responsibility of the GPU. It's the same on PC and consoles.

If so, are PC games/ports similarly designed or is it fundamentally different - aka does the PC version task the CPU to do more?

For the most part, no. Game developers will usually task CPUs with workloads that work better when handled by a CPU and will have the GPU handle workloads more appropriate for it.

GPU compute allows certain tasks to be offloaded to the GPU which were traditionally done on the CPU, but these workloads are very specific. They need to be compatible with the massive parallelization that the GPU brings in order to be worthwhile.

Again, this is true in both consoles and PC. However, consoles right now are more efficient with most GPU compute tasks since they can be done asynchronously and at the same time the GPU is working on other 3D related tasks. The same ability is coming to PC, and in the meantime hardware vendors like Nvidia offer solutions that work well with modern hardware.
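To illustrate the kind of workload that maps well onto GPU compute versus one that doesn't, here's a rough sketch of my own (not code from any particular engine; the NumPy version just shows the shape of the problem on the CPU):

```python
import numpy as np

# Data-parallel workload: every particle is updated independently, so the work
# splits cleanly across thousands of GPU threads.
def update_particles(positions, velocities, dt):
    return positions + velocities * dt

# Serial workload: every step depends on the previous result, so there is no
# parallelism for a GPU to exploit - this kind of logic stays on the CPU.
def simulate_serial(x, steps):
    for _ in range(steps):
        x = 0.5 * x + 1.0
    return x

positions = np.random.rand(100_000, 3)
velocities = np.random.rand(100_000, 3)
print(update_particles(positions, velocities, 1.0 / 60.0).shape)
print(simulate_serial(0.0, 1000))
```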

Also, wouldn't having roughly 3.5 GB more RAM help the PC?

Not in terms of performance, not unless the application is starved for RAM.
 

M3d10n

Member
You indirectly explained very well why the Carmack quote is often unwittingly used as a red herring around this place. I admit the OP was somewhat unclear, but I also kinda feel that your post partly supports his apparent theory (if I understood him) that the Carmack quote isn't as directly usable to point out "PC inefficiency" across the board as some think.

My intentions were two:

1) Showing that the Carmack quote isn't as widely applicable as people like it to be (which means there's no myth to bust), especially in the way the OP twisted it.

2) That there is some truth to the "myth", at least when it came to the PS360.

The "PC inefficiency" exists in the form of having to code a game to run in a wide range of hardware. Tinkering with shaders, textures and even placement of LOD transitions to fit within a particular GPU's deficiencies often isn't worth the trouble when you can simply up your minimum system requirements a notch or two and allow users to toggle stuff on and off.

The fixed nature of consoles also allows the developer to get away with hack-ish techniques which aren't guaranteed to work on different GPUs or even different driver versions. A good example is Rage: the engine's megatextures used the GPU in very unusual ways which made the existing PC drivers burst at their seams. On a console, "if it works, it works".

Nowadays there isn't much about the PC architecture that creates a "system-wide inefficiency". If you want to spend the time and effort making sure a PC game runs at a constant 60fps while showing the best possible visuals on an Intel HD5000, you definitely can. That's basically how modern arcade games work, since they have been Windows PCs for several years now.

Except that it plays fast and loose with the facts. Most comically, stating that the 8800GT was a DX10 GPU and benefited from reduced draw calls, which is only true if the game was actually made using DirectX 10 and all the optimizations it offered.
Spoiler: that rarely happened until DX11 came along, and a majority of the popular games from the last console generation were DX9.

But some of those architecture changes did affect DX9 games as drivers were updated, because many of them are actually used under the hood by the drivers even in DX9 games, kinda like how DX9 GPUs would actually use shaders under the hood when running DX7 and older games, since they didn't have fixed-function T&L and combiner hardware anymore.

Also, using Tomb Raider 2013 is a bad idea because the game actually uses DX10/11 on the PC (unless you tested it on Windows XP). Again, Carmack's statement only holds water if you use the same hardware. If you move to better hardware the parameters change too much, especially if you move to different-generation hardware. Try running TR2013 on a Radeon X1800 and see how well it fares. BTW, the minimum required AMD GPU for TR2013 is a Radeon HD 2600 XT, which is one generation newer than the 360's GPU. (Using "DBZ" power measurement, the 2600XT is 3X faster than the X1950 Pro).
 
This is a very simplistic way of putting things. The most demanding task of modern game engines is usually that involved in 3D rasterization. That's, for the most part, the sole responsibility of the GPU. It's the same on PC and consoles.



For the most part, no. Game developers will usually task CPUs with workloads that work better when handled by a CPU and will have the GPU handle workloads more appropriate for it.

GPU compute allows certain tasks to be offloaded to the GPU which were traditionally done on the CPU, but these workloads are very specific. They need to be compatible with the massive parallelization that the GPU brings in order to be worthwhile.

Again, this is true in both consoles and PC. However, consoles right now are more efficient with most GPU compute tasks since they can be done asynchronously and at the same time the GPU is working on other 3D related tasks. The same ability is coming to PC, and in the meantime hardware vendors like Nvidia offer solutions that work well with modern hardware.



Not in terms of performance, not unless the application is starved for RAM.

Thanks for the response, you are talking about GPGPU correct?

I'm just not sure I've seen an accurate comparison. If the goal is to test whether a high- to mid-level GPU from around the time of the Xbox 360/PS3 launch can still play games at comparable or better performance, then I'd say the answer is probably yes.

But if it's a general question of a PC built in 2006-2007 with comparable specs, I've yet to see it. If the consoles operate with ~512MB of RAM, does that mean a computer with that - or even 1GB - is capable of running multiplatform games at the same performance as a console? I'm not so sure.

The reason for this question, I think, is to debunk the notion that either it takes a PC roughly twice the amount of computational power of a console to achieve roughly equal performance, or that a console will eventually be able to produce better performance/graphics (or whatever general quality assessment) after a number of years. The former I think is overstated with some truth to it, but the latter I tend to agree with.

As I feel the OP is mainly focusing on the latter myth, I find it odd that the PC doesn't align with the rough specs of the consoles.

Anyways, like I said I'm not that technically competent so I struggle to meaningfully engage in topics like this; but I do find it interesting.
 

Wavebossa

Member
You really just used an 8800GT vs a PS3? I don't disagree with your premise but you lost all credibility with that comparison.
 

AmFreak

Member
Also, using Tomb Raider 2013 is a bad idea because the game actually uses DX10/11 on the PC (unless you tested it on Windows XP). Again, Carmack's statement only holds water if you use the same hardware. If you move to better hardware the parameters change too much, especially if you move to different-generation hardware. Try running TR2013 on a Radeon X1800 and see how well it fares. BTW, the minimum required AMD GPU for TR2013 is a Radeon HD 2600 XT, which is one generation newer than the 360's GPU. (Using "DBZ" power measurement, the 2600XT is 3X faster than the X1950 Pro).


A 2600XT is worse than the Xenos in every measurable way.
It has less shader power, less pixel power and less texel power.
It also has only 22.4GB/s of bandwidth (in its slower version) compared to the Xenos' 22.4GB/s (shared with the CPU) + 32GB/s to the daughter die (EDRAM) + the 256GB/s the ROPs in the daughter die have to the EDRAM.
And it's also not a generation ahead of Xenos; it's a generation ahead of the 1xxx series. Xenos sits between the two.
A comparison with a 2xxx series card is the much more honest one, because the 1xxx series doesn't even have unified shaders.
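Tallying those figures quickly (all spec-sheet numbers, so treat them as rough):

```python
# Spec-sheet bandwidth figures from the comparison above (rough numbers).
hd2600xt_bw = 22.4          # GB/s, in its slower memory configuration

xenos_main_bw = 22.4        # GB/s to main RAM, shared with the CPU
xenos_daughter_bw = 32.0    # GB/s from the GPU to the EDRAM daughter die
xenos_rop_edram_bw = 256.0  # GB/s between the ROPs and the EDRAM itself

print(f"HD 2600 XT: {hd2600xt_bw} GB/s total")
print(f"Xenos: {xenos_main_bw} GB/s main + {xenos_daughter_bw} GB/s to the daughter die, "
      f"plus {xenos_rop_edram_bw} GB/s ROP-to-EDRAM")
```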
 

danwarb

Member
You really just used an 8800GT vs a PS3? I don't disagree with your premise but you lost all credibility with that comparison.

Nope. The OP is comparing the relative performance of an 8800GT versus the PS360 in more recent games, over its equally long life span.
 

Kinthalis

Banned
Nope. The OP is comparing the relative performance of an 8800GT versus the PS360 in more recent games, over its equally long life span.

THIS.

People keep on harping that the 8800 GT isn't EXACTLY the same as an Xbox 360/PS3 (keep in mind that in the case of the PS3, it's not just the GPU that needs to be taken into account when comparing 3D rendering performance - the Cell does a lot of shader work).

That's not the point. The point is that something that has similar hardware specs (even if a bit better) does not in any way, shape or form perform 50% less efficiently than the comparable console hardware.

THAT is the myth. That you need double the performance to match console spec. In terms of GPU hardware that is patently, absurdly, DEMONSTRABLY (most importantly ;) ), false.

In terms of CPU power I think we start to see that this is likely true, especially when discussing ancient APIs like DX9.
 

Kinthalis

Banned
Thanks for the response, you are talking about GPGPU correct?

I'm just not sure I've seen an accurate comparison. If the goal is to test whether a high- to mid-level GPU from around the time of the Xbox 360/PS3 launch can still play games at comparable or better performance, then I'd say the answer is probably yes.

That is the other half of the myth. The part that gets trotted out even more than the "Twice the spec!!! CARMACK!!!!!" stuff.

That through the magic of "secret sauce" or to-the-metal coding, console hardware eventually surpasses even what high-end PC hardware from the start of the gen can do.

Again, this shows that's NOT true: even with games coming out at the end of that console generation, better PC hardware almost as old as those consoles STILL performs better.

Closed platform and low level API development (which is coming to PC this gen) can only do so much.
 

Wavebossa

Member
THIS.

People keep on harping that the 8800 GT isn't EXACTLY the same as an Xbox 360/PS3 (keep in mind that in the case of the PS3, it's not just the GPU that needs to be taken into account when comparing 3D rendering performance - the Cell does a lot of shader work).

That's not the point. The point is that something that has similar hardware specs (even if a bit better) does not in any way, shape or form perform 50% less efficiently than the comparable console hardware.

THAT is the myth. That you need double the performance to match console spec. In terms of GPU hardware that is patently, absurdly, DEMONSTRABLY (most importantly ;) ), false.

In terms of CPU power I think we start to see that this is likely true, especially when discussing ancient APIs like DX9.

As I said, I don't disagree with the premise. But the point still stands that he should have used different hardware in his comparison. There are already enough complications with comparing console hardware to PC hardware; why further complicate it by using a GPU that is 2 generations ahead (yes, 2 - the 8800GT is a generation ahead of the 8800GTX, it is basically a 9XXX series card)? The GPU is not similar at all and makes him lose credibility.
 

wachie

Member
Again, this shows that's NOT true: even with games coming out at the end of that console generation, better PC hardware almost as old as those consoles STILL performs better.
Hardware that is in some aspects more than three times better is performing better (wrt Carmack's 2x tweet) - why is this surprising?

Again, I don't doubt the intention, but as saner minds like dark10x have pointed out, the comparison's footing is all wrong. Hopefully dictator can come up with some better benchmarks for us to compare.
 

Kinthalis

Banned
As I said, I don't disagree with the premise. But the point still stands that he should have used different hardware in his comparison. There are already enough complications with comparing console hardware to PC hardware; why further complicate it by using a GPU that is 2 generations ahead (yes, 2 - the 8800GT is a generation ahead of the 8800GTX, it is basically a 9XXX series card)?

You think he should have used the more powerful 8800 GTX instead?

Performance is what we're trying to compare. As long as, ON PAPER, the performance is similar and the hardware features are also similar, I think the comparison is valid. We can then make estimations based on what we know and expect given the minor hardware differences that DO exist.

I mean, maybe we CAN use a different GPU to further illustrate the OP's point. What GPU would you suggest?
 

MadOdorMachine

No additional functions
So many people are missing the point of the OP, which is that the console gained no advantage over a PC that was never upgraded. Specs don't matter. It doesn't matter if the PC was more powerful than the PS3/360 or not. What matters is that the same PC retained its performance level over the life of the console. Consoles have the advantage of being closed systems. Theoretically, as developers learn the architecture over time, the console should be able to do things not possible on PC. The OP proved this isn't the case, because games that were released later in the generation still showed the same performance advantage on the PC as at the beginning.

Ironically, it actually works the other way as well. In 2007 (and a few years after) Crysis was the most demanding PC game available. The fact that it was ported to consoles - something most didn't think was possible - is proof that the performance increases were pretty close to uniform across console and PC. In other words, optimizations were made that brought the console closer to the level of the PC. Both console and PC saw roughly the same performance increase over the lifespan without one gaining a real advantage over the other.

As much as people love to say consoles are holding back PC, I argue that isn't true. Consoles are still needed because they provide a way for developers to optimize to set specs whereas on PC they would likely lean toward just requiring more power. That's a good thing because it means you should be able to drop a $150 GTX 750 into your PC and be able to run any PS4/XB1 game at near the same quality for the life span of the console. Because consoles are now more than ever closer to a PC in specs, it also means we might see major strides taken in multi core processors going forward. I imagine this would take a lot longer otherwise.
 

Xiraiya

Member
2x just at a glance seems like an exaggeration, but I mean who cares.

In like 2006 or maybe 2007, I forget, I bought a PC with a 7950GT and an AMD Athlon 64 X2 Dual Core 5200+.
I played the Mass Effect series on it at what would be "High" relatively fine, and even played Oblivion and Skyrim; Skyrim ran better.
Off the top of my head I have no idea what the specs in a PS3/360 are anymore but I'd say if I could play PS3/360 games just fine, then there you go.

At the same time though, a console is designed specifically around playing those games, so it's not unbelievable to suggest they might be more efficient. They were back in the PS2/GameCube days, that's for sure - you wouldn't have been able to run much if you tried to mimic a console with a PC of 1:1 specs.
 

Kinthalis

Banned
Hardware that is in some aspects more than three times better is performing better (wrt Carmack's 2x tweet) - why is this surprising?

Again, I don't doubt the intention, but as saner minds like dark10x have pointed out, the comparison's footing is all wrong. Hopefully dictator can come up with some better benchmarks for us to compare.

According to the twice-the-spec people, it should be performing MUCH closer to an 8800 GT than it actually does though.

So we know twice the spec (for GPUs) IS wrong. The question is: how much more performance can you realistically get out of a rendering engine on a closed-spec, low-level API machine?

I think a much more realistic range is anywhere from 0%-30% improvement depending on the time, the skill, and the budget of the game in question.

That's probably about the same difference you're going to see from a PS4 to an Xbone, which according to many console gamers is "barely noticeable".
 

There is no best equivalent for Xenos or that POS in the PS3. They were actually customized chips (in the 360's favor and to the PS3's disadvantage).

Best would be to find something that just performs similarly in one game as a baseline. Or something that performs exactly 2x as well, and then measure how it scales with games over time.

So, an 8800 GT. Take a game from the first part of the gen at console-like settings and measure FPS.

Then take a game from later in the gen at console-like settings and measure FPS. If the differential in performance from PC to console remains similar... the myth is busted?
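In rough pseudo-benchmark form (the FPS numbers here are invented, only the method matters):

```python
# Invented FPS numbers, just to show the method: compare the PC/console ratio
# for an early-gen multiplat against a late-gen one. If the ratio holds steady,
# the console never "caught up".
early_gen = {"pc_fps": 45.0, "console_fps": 30.0}   # hypothetical 2007 game
late_gen = {"pc_fps": 42.0, "console_fps": 28.0}    # hypothetical 2013 game

early_ratio = early_gen["pc_fps"] / early_gen["console_fps"]
late_ratio = late_gen["pc_fps"] / late_gen["console_fps"]

print(f"Early-gen PC/console ratio: {early_ratio:.2f}")
print(f"Late-gen PC/console ratio:  {late_ratio:.2f}")
print("Console gained ground" if late_ratio < early_ratio * 0.9 else "No real catch-up")
```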
 
I see many are still missing the point. This is not a like-for-like comparison of hardware. It's about people's belief that somehow, magically, current PC hardware will be suffering at the end of the current console generation while consoles will still be going strong. This isn't the case. My 8800GT is faster than the old consoles; it was then and it is now. Just like today's cards will still be outperforming the PS4 and XB1 7 years from now.

I just woke up and might be able to downclock and run a bench before work, but no promises.
 

Unai

Member
I see many are still missing the point. This is not a like-for-like comparison of hardware. It's about people's belief that somehow, magically, current PC hardware will be suffering at the end of the current console generation while consoles will still be going strong. This isn't the case. My 8800GT is faster than the old consoles; it was then and it is now. Just like today's cards will still be outperforming the PS4 and XB1 7 years from now.

I just woke up and might be able to downclock and run a bench before work, but no promises.

This.

Edit: Didn't notice you are the OP.
 

Mohasus

Member

I may be wrong but here is what I think this thread is about.
It is still common to see comments like "you'll have to upgrade your PC anyway because it'll be obsolete in 3 or 4 years, while my PS4 will last the whole generation. You don't see people gaming on 2007 rigs anymore".
Comparing similar hardware isn't the point and shouldn't be. I can't buy an 8-core Jaguar APU for my PC. What I can buy is an i3/i5 or whatever.

It would be like me doing a benchmark in 2020 (or whenever this generation ends) using my current PC even though I heard that I'd need to upgrade it at least twice to keep up with latest games.
 

UnrealEck

Member
A lot of console games will have their graphics settings tweaked in more detail. They'll change things like, say, LOD distance using precise numbers on the console and settle on what they think looks best and performs best. That's why even comparing graphics settings on PC is so hard to do. I'm not sure why people seem to think hardware in a console defies its theoretical output.
 
I see many are still missing the point. This is not a like-for-like comparison of hardware. It's about people's belief that somehow, magically, current PC hardware will be suffering at the end of the current console generation while consoles will still be going strong. This isn't the case. My 8800GT is faster than the old consoles; it was then and it is now. Just like today's cards will still be outperforming the PS4 and XB1 7 years from now.

I just woke up and might be able to downclock and run a bench before work, but no promises.

To me, the best way to do a comparison like this is thusly:

Take a PC from 2006-2007 without any hardware upgrades. Then, take a multiplatform game that was on PC and consoles at the same time in the 2006-2007 time period and get a baseline for the visuals/performance for both the PC and console.

Then, take a multiplatform game from 2013-2014 that appeared on both PC and consoles at the same time, and get a baseline on visuals/performance for both.

Then compare those two baselines. I have a feeling that the console baseline will gain on the PC baseline, especially if more than one game is tested to reduce the chance of poor ports.

Some things to take into account: the higher the specs of a computer, the more future-proof it is in relation to consoles. If you had a really high-end GPU, good CPU, and a lot of RAM back in 2007, it is logically going to take longer for the console "myth" of overtaking said computer to prove true (if at all). Also, the right OS and reducing background processes (and whatever other mitigating factors) can all help increase the longevity of a PC.
 
To me, the best way to do a comparison like this is thusly:

Take a PC from 2006-2007 without any hardware upgrades. Then, take a multiplatform game that was on PC and consoles at the same time in the 2006-2007 time period and get a baseline for the visuals/performance for both the PC and console.

Then, take a multiplatform game from 2013-2014 that appeared on both PC and consoles at the same time, and get a baseline on visuals/performance for both.

Then compare those two baselines. I have a feeling that the console baseline will gain on the PC baseline, especially if more than one game is tested to reduce the chance of poor ports.

Some things to take into account: the higher the specs of a computer, the more future-proof it is in relation to consoles. If you had a really high-end GPU, good CPU, and a lot of RAM back in 2007, it is logically going to take longer for the console "myth" of overtaking said computer to prove true (if at all). Also, the right OS and reducing background processes (and whatever other mitigating factors) can all help increase the longevity of a PC.

Feel free to do that if you wish. I just happened to have a HTPC with the Celeron already installed up and running. All I had to do was install the 8800GT.

I don't understand the infatuation with having exact hardware to compare. We can make extrapolations from the data given.

I've looked at Alan Wake and that's going to be a tough one. It doesn't have a built-in benchmark and the X360 version runs at 960x544. The PC options only allow for 720p. BioShock Infinite has a built-in benchmark, but I have to try and figure out what the equivalent console settings are. The resolution will be a problem as well, because neither the Xbox 360 (1152x720) nor the PS3 (1152x640) runs at a full 720p.
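Multiplying those resolutions out for reference (full 1280x720 included for comparison; just my quick math, not from any benchmark):

```python
# Pixel counts for the resolutions mentioned above.
full_720p = 1280 * 720   # 921,600 pixels
x360_res = 1152 * 720    # 829,440 pixels
ps3_res = 1152 * 640     # 737,280 pixels

for name, pixels in [("Full 720p (1280x720)", full_720p),
                     ("Xbox 360 (1152x720)", x360_res),
                     ("PS3 (1152x640)", ps3_res)]:
    print(f"{name}: {pixels:,} pixels ({pixels / full_720p:.0%} of full 720p)")
```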

With that being said I'm going to do my best to show some Bioshock Infinite after work.

Edit: Bioshock Infinite allows for custom resolutions in the benchmark.
 
Then why did you bring up the Carmack quote? It's only related to the former, not the latter.

Because it is misused by many and used as a bullet point that console optimizations are some magic coding-to-the-metal trick, because, I mean, look what John Carmack said. If the Carmack quote were true in that context, then the PS360s should be outperforming old PC hardware by a large margin.
 
I don't understand the infatuation with having exact hardware to compare. We can make extrapolations from the data given.

It's not necessarily about having the exact same hardware. The premise is that, because developers can optimize their game engines and use all the other coding wizardry that comes from working on a fixed platform, a developer can - in essence - eke more performance/visuals out of a console than a computer as old as the console can manage.

So it's one thing to compare a PC with a GPU that's relatively comparable to last-gen consoles based on a decently new game. But that doesn't exactly prove your point, unless it's specifically that a PC with a pretty good GPU from after the Xbox 360 launched, along with reasonable specs, can perform on the same level as, if not better than, a console.

My scenario would be gauging whether or not consoles are more capable than a PC built around the time the console launches. The reason is because if a console in fact is more capable in visuals/performance, then that proves the whole "coding wizardry" myth is in fact true. It's probably going to be less true than ever this generation, but it remains a theory many dispute.

I personally subscribe to the fixed platform theory, which is why I'd like to see a more conclusive comparison. Of course, you can only work with what you've got, so no worries OP :)

But don't claim the myth is busted unless the testing covers all the bases.
 
If you are supersampling, stop it. Also, remove TressFX.


Whoops, my bad, it wasn't tessellation but TressFX that I wanted to say in my post! Also, supersampling is already off too.

Thanks anyway, I plan to change pieces of my PC anyway for The Witcher 3 and future games.
 

danwarb

Member
You'll have to turn down the high-res PC textures for Bioshock Infinite. The PC game is twice the size of the console version.
 
It's not necessarily about having the exact same hardware. The premise is that, because developers can optimize their game engines and use all the other coding wizardry that comes from working on a fixed platform, a developer can - in essence - eke more performance/visuals out of a console than a computer as old as the console can manage.

So it's one thing to compare a PC with a GPU that's relatively comparable to last-gen consoles based on a decently new game. But that doesn't exactly prove your point, unless it's specifically that a PC with a pretty good GPU from after the Xbox 360 launched, along with reasonable specs, can perform on the same level as, if not better than, a console.

My scenario would be gauging whether or not consoles are more capable than a PC built around the time the console launches. The reason is because if a console in fact is more capable in visuals/performance, then that proves the whole "coding wizardry" myth is in fact true. It's probably going to be less true than ever this generation, but it remains a theory many dispute.

I personally subscribe to the fixed platform theory, which is why I'd like to see a more conclusive comparison. Of course, you can only work with what you've got, so no worries OP :)

But don't claim the myth is busted unless the testing covers all the bases.

It's cool I'm having fun either way. I updated the OP.

I was messing around with the BioShock Infinite benchmark and it allows for custom resolutions. I plan to set the game to the same resolution as the X360 version. I will also try to use equivalent settings (this is where things get tricky). I'll probably just run presets and maybe one custom setting that I feel is appropriate, along with a downclock; that's the best I can do without a different GPU. I'm at work ATM, so it's going to be at least 9 hours before I can do it.

According to Chris Kline in this IGN article

http://m.ign.com/articles/2013/01/15/bioshock-infinite-the-pc-version-difference

Chris Kline: Playing the PC version on Medium settings is fairly close to the console version, though higher quality in a few areas. As you go up from there to the High, Very High, and Ultra settings the difference is enormous.

So I'm going to go with Medium settings and see what that gets me.
 

MadOdorMachine

No additional functions
To me, the best way to do a comparison like this is thusly:

Take a PC from 2006-2007 without any hardware upgrades. Then, take a multiplatform game that was on PC and consoles at the same time in the 2006-2007 time period and get a baseline for the visuals/performance for both the PC and console.

Then, take a multiplatform game from 2013-2014 that appeared on both PC and consoles at the same time, and get a baseline on visuals/performance for both.

Then compare those two baselines. I have a feeling that the console baseline will gain on the PC baseline, especially if more than one game is tested to reduce the chance of poor ports.

Some things to take into account: the higher the specs of a computer, the more future-proof it is in relation to consoles. If you had a really high-end GPU, good CPU, and a lot of RAM back in 2007, it is logically going to take longer for the console "myth" of overtaking said computer to prove true (if at all). Also, the right OS and reducing background processes (and whatever other mitigating factors) can all help increase the longevity of a PC.

I'm not trying to be rude, but this is silly and now I can see why the OP was so irritated earlier. Everything you said was either in the OP or in the post I made just above yours, not to mention other comments throughout the thread. I can deal with people not understanding something, but it appears some people simply aren't reading before posting.

- The OP used a PC w/2006/2007 parts. Bear in mind also that even at the time (2007) this PC would have been considered mid-range and built for under $1000 w/all parts and software included. Although it's admittedly more powerful than the consoles, the consoles had an advantage (particularly the 360) those first two years and it wasn't until around this time that PC could match or exceed consoles at an affordable price. On that note, it was also around this time that we finally started seeing console games that truly looked next gen.

- You can't compare apples to apples hardware on console and PC. It's just not possible because the consoles back then were highly customized computers built from the ground up. The difference today is that the PS4/XB1 are stripped down versions of a PC. All the optimizations made on PC hardware over the past 7 years were directly transferred to console. This is in direct contrast to how things have been in the past where consoles were ahead of their time or "future proofed" compared to PC at the time of its release.

- Crysis came out in fall 2007 and was PC exclusive. The console version came out in fall 2011. Having played Crysis w/a similar rig (Dual Core Pentium D, 8800GT and probably 2 GB of RAM) and the 360 version, I can tell you the PC version would have been marginally better than the console version. The biggest difference being resolution, if memory serves.

- A game like Gears of War would have very similar results being released in 2006/2007 on 360/PC.

- Tomb Raider was released in 2013 for both console and PC. Again, the difference in performance appears to be the same gap between console and PC as with Crysis and Gears of War.

- All three of these games prove that optimization on one platform benefitted the other, so console games didn't gain any ground over PC. They both benefitted the same. Imo, Crysis is a great example because a console port wasn't possible until Crytek updated from CryEngine 2 to 3. CryEngine 3 and Tomb Raider's engine could both be considered "Next Gen Ready" at the time these games were released. Gears of War (and UE3) benefitted so heavily last gen because the engine was made for both platforms early on last gen and continued to be updated and improved on for years.

- Specs are really irrelevant, but the example he used is pretty close (although admittedly more powerful) to the PS3/360. The PC retained roughly the same performance advantage over the life of the console. The same disadvantage for PC would have happened if it had lower specs.

- It's too early to say whether the closed nature of consoles will still push graphics and performance ahead like it has in the past. Imo, it's foolish to think it wouldn't though. I expect to see the same thing this gen as we have every other gen. As time goes on and developers get more familiar with the hardware, they will start taking more advantage of it and we will see things that hadn't been possible before. Due to the architecture of all these systems, it should transfer over to PC even easier than before and benefit everyone including what we will see in PS5/Xbox 4.

- Currently, computer hardware power improvements have drastically outpaced the software. It's why we've seen such a huge increase in performance in mobile devices vs. PC-like devices. The software has not been able to keep up with the hardware. Consoles may be the catalyst to finally move the software forward. I haven't really been paying attention to next gen engines like UE4 or Unity 5, but hopefully they have been built with multi cores in mind. Imo, the CPU is the area that has been the weakest in taking advantage of its capabilities. In other words, they haven't even scratched the surface of what PC CPUs that have been available for the last 5 years or so are really capable of.
 
No matter how many times people interpret it that way, Carmack did not say, or imply, that 3x more powerful hardware would not hold an advantage over time. He only talked about equal hardware to the PS3 and 360. That's it; it was a special circumstance.

The 8800gt can already demolish the 7800gtx in some tests, and the PS3 GPU is even worse than the 7800gtx.

Would the PS3 outperform a PC that was downgraded across the board including a partially disabled 7800gtx? Maybe, maybe not. But testing an 8800gt does nothing to prove it either way. For example, I can find tests in which the 8800gt scores nearly 5x better than the 7800gtx. So if you show a test in which the 8800gt scores 2x better than a ps3, that could be seen as good evidence in favor of Carmack. (until you go back and make relevant tests between more equal hardware, which may totally bust his claim)
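To put rough numbers on that reasoning (the ratios here are illustrative, not from any single benchmark):

```python
# Illustrative ratios only: if an 8800GT is ~5x a 7800GTX in some tests, and the
# PS3's RSX is weaker than a 7800GTX, then an 8800GT beating the PS3 by "only"
# ~2x would imply the console punches well above its raw hardware.
ratio_8800gt_vs_7800gtx = 5.0   # the kind of gap seen in some tests
ratio_8800gt_vs_ps3 = 2.0       # a hypothetical result against the PS3

implied_ps3_vs_7800gtx = ratio_8800gt_vs_7800gtx / ratio_8800gt_vs_ps3
print(f"Implied PS3 performance relative to a 7800GTX: ~{implied_ps3_vs_7800gtx:.1f}x")
```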
 

Naminator

Banned
Feel free to do that if you wish. I just happened to have a HTPC with the Celeron already installed up and running. All I had to do was install the 8800GT.

I don't understand the infatuation with having exact hardware to compare. We can make extrapolations from the data given.

I've looked at Alan Wake and that's going to be a tough one. It doesn't have a built-in benchmark and the X360 version runs at 960x544. The PC options only allow for 720p. BioShock Infinite has a built-in benchmark, but I have to try and figure out what the equivalent console settings are. The resolution will be a problem as well, because neither the Xbox 360 (1152x720) nor the PS3 (1152x640) runs at a full 720p.

With that being said I'm going to do my best to show some Bioshock Infinite after work.

Edit: Bioshock Infinite allows for custom resolutions in the benchmark.

Just make a custom resolution in the Nvidia control panel dude, I think that should work.
No matter how many times people interpret it that way, Carmack did not say, or imply, that 3x more powerful hardware would not hold an advantage over time. He only talked about equal hardware to the PS3 and 360. That's it; it was a special circumstance.

The 8800gt can already demolish the 7800gtx in some tests, and the PS3 GPU is even worse than the 7800gtx.

Would the PS3 outperform a PC that was downgraded across the board including a partially disabled 7800gtx? Maybe, maybe not. But testing an 8800gt does nothing to prove it either way. For example, I can find tests in which the 8800gt scores nearly 5x better than the 7800gtx. So if you show a test in which the 8800gt scores 2x better than a ps3, that could be seen as good evidence in favor of Carmack. (until you go back and make relevant tests between more equal hardware, which may totally bust his claim)
OK, that's great, but how would one get hold of a Cell processor, and software that utilizes it?

Seems to me, that throughout this thread people who want "equal hardware" to be tested, only want it when it vaguely benefits their argument.
 
No matter how many times people interpret it that way, Carmack did not say, or imply, that 3x more powerful hardware would not hold an advantage over time. He only talked about equal hardware to the PS3 and 360. That's it; it was a special circumstance.

The 8800gt can already demolish the 7800gtx in some tests, and the PS3 GPU is even worse than the 7800gtx.

Would the PS3 outperform a PC that was downgraded across the board including a partially disabled 7800gtx? Maybe, maybe not. But testing an 8800gt does nothing to prove it either way. For example, I can find tests in which the 8800gt scores nearly 5x better than the 7800gtx. So if you show a test in which the 8800gt scores 2x better than a ps3, that could be seen as good evidence in favor of Carmack. (until you go back and make relevant tests between more equal hardware, which may totally bust his claim)

I'm really regretting using that Carmack quote, because it's what people are focusing on. I've updated the OP to say this, but I'm not trying to disprove Carmack's quote, and others have explained it better than I have. I am pointing out how it's misused, which, because of my lack of clarity, has made it appear that I am misusing it.
You seem focused on the RSX, which I have updated the OP to comment on, but I feel I should put it in the tail of the thread as well. It's too hard to compare PC parts to the PS3's exotic architecture, considering the Cell is used to help out its weak GPU. So I am focusing on the Xbox 360. The 360's more straightforward CPU, GPU and memory configuration makes for an easier comparison.
In light of that, Wikipedia shows the Xenos at 240 GFLOPS and the 8800GT at 500 GFLOPS. I'm going to downclock my GPU by half and run BioShock Infinite at 1152x720 and medium settings to best emulate the 360. That's the best I can do.
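The quick math behind the downclock (Wikipedia spec-sheet numbers, so only a rough match on paper):

```python
# Spec-sheet figures quoted above (rough, theoretical numbers).
xenos_gflops = 240.0
gt8800_gflops = 500.0

# Shader throughput scales roughly linearly with core clock, so halving the
# clock should land the 8800GT in Xenos' ballpark on paper.
downclocked_gflops = gt8800_gflops * 0.5
print(f"8800GT at half clock: ~{downclocked_gflops:.0f} GFLOPS vs Xenos at {xenos_gflops:.0f} GFLOPS")
```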

Just make a custom resolution in the Nvidia control panel dude, I think that should work.

Cool, I'll do that for Alan Wake. It doesn't have a benchmark built in, so I might have to take video, which will be difficult to upload on my slow connection, but I'm going to try to make something legit.
 
I'm not trying to be rude, but this is silly and now I can see why the OP was so irritated earlier. Everything you said was either in the OP or in the post I made just above yours, not to mention other comments throughout the thread. I can deal with people not understanding something, but it appears some people simply aren't reading before posting.

- The OP used a PC w/2006/2007 parts. Bear in mind also that even at the time (2007) this PC would have been considered mid-range and built for under $1000 w/all parts and software included. Although it's admittedly more powerful than the consoles, the consoles had an advantage (particularly the 360) those first two years and it wasn't until around this time that PC could match or exceed consoles at an affordable price. On that note, it was also around this time that we finally started seeing console games that truly looked next gen.

- You can't compare apples to apples hardware on console and PC. It's just not possible because the consoles back then were highly customized computers built from the ground up. The difference today is that the PS4/XB1 are stripped down versions of a PC. All the optimizations made on PC hardware over the past 7 years were directly transferred to console. This is in direct contrast to how things have been in the past where consoles were ahead of their time or "future proofed" compared to PC at the time of its release.

- Crysis came out in fall 2007 and was PC exclusive. The console version came out in fall 2011. Having played Crysis w/a similar rig (Dual Core Pentium D, 8800GT and probably 2 GB of RAM) and the 360 version, I can tell you the PC version would have been marginally better than the console version. The biggest difference being resolution, if memory serves.

- A game like Gears of War would have very similar results being released in 2006/2007 on 360/PC.

- Tomb Raider was released in 2013 for both console and PC. Again, the difference in performance appears to be the same gap between console and PC as with Crysis and Gears of War.

- All three of these games prove that optimization on one platform benefitted the other, so console games didn't gain any ground over PC. They both benefitted the same. Imo, Crysis is a great example because a console port wasn't possible until Crytek updated from CryEngine 2 to 3. CryEngine 3 and Tomb Raider's engine could both be considered "Next Gen Ready" at the time these games were released. Gears of War (and UE3) benefitted so heavily last gen because the engine was made for both platforms early on last gen and continued to be updated and improved on for years.

- Specs are really irrelevant, but the example he used is pretty close (although admittedly more powerful) to the PS3/360. The PC retained roughly the same performance advantage over the life of the console. The same disadvantage for PC would have happened if it had lower specs.

- It's too early to say whether the closed nature of consoles will still push graphics and performance ahead like it has in the past. Imo, it's foolish to think it wouldn't though. I expect to see the same thing this gen as we have every other gen. As time goes on and developers get more familiar with the hardware, they will start taking more advantage of it and we will see things that hadn't been possible before. Due to the architecture of all these systems, it should transfer over to PC even easier than before and benefit everyone including what we will see in PS5/Xbox 4.

- Currently, computer hardware power improvements have drastically outpaced the software. It's why we've seen such a huge increase in performance in mobile devices vs. PC-like devices. The software has not been able to keep up with the hardware. Consoles may be the catalyst to finally move the software forward. I haven't really been paying attention to next gen engines like UE4 or Unity 5, but hopefully they have been built with multi cores in mind. Imo, the CPU is the area that has been the weakest in taking advantage of its capabilities. In other words, they haven't even scratched the surface of what PC CPUs that have been available for the last 5 years or so are really capable of.

I don't know why it's so silly. What is wrong with establishing criteria - a baseline from the beginning of the console generation - and comparing it to the "end" of the generation? To me it seems the most reasonable way to address this myth. Beyond that, I'm not entirely sure how you approach the rest of the internals of the PC.

The whole idea of the myth is that a fixed platform has larger improvements over a long period of time than the PC platform. That is not solely dependent on the GPU, but on the overall build. And I don't think many people would claim that a beastly rig from 2007 can't compete with a console, because that's silly.

So it's hard to contextualize the debate because nobody really agrees on what the exact question is. But let's at least try to be semi scientific about it.
 
While I kind of see where you're coming from, it doesn't make any sense. PCs still have the same basic structure they've always had. You have a CPU, memory, buses, and a video card. It doesn't matter when it was made; all that really matters in this context is the theoretical performance of each part. All we can do is use the best facsimile available.
 

Vaporak

Member
And I don't think many people would claim that a beastly rig from 2007 can't compete with a console, because that's silly.

You say it's silly, but console fanboys walk into threads and routinely claim exactly that. Which is why this thread exists in the first place, to disprove the claim that PC hardware from earlier in the console generation isn't up to the task of playing games anymore.
 

Armaros

Member
And I don't think many people would claim that a beastly rig from 2007 can't compete with a console, because that's silly.

Then you don't actually understand the reason why this thread was made.

Those exact types of nonsensical arguments constantly occur whenever console optimization is brought up vs PC game optimization.
 
BioShock Infinite Benched

Another test

Intel Celeron G530@2.4GHz
ASUS P8H61-M
4 GB GSKILL DDR3@1333
MSI NX8800GT OC

I ran benchmarks and took some video for comparison. I used medium settings based off a quote from this Article
Chris Kline: Playing the PC version on Medium settings is fairly close to the console version, though higher quality in a few areas. As you go up from there to the High, Very High, and Ultra settings the difference is enormous.
and an 1152x720 resolution based off the BioShock Infinite Face-Off
[Update: after spending more time with the game and taking another look at the assets, we now reckon that Xbox 360 renders at 1152x720 with PS3 coming in at 1152x640.]
First Benchmarks.

8800GT Stock clock (660mhz) with an unlocked frame rate.
Per Scene Stats:
Scene Duration (seconds), Average FPS, Min FPS, Max FPS, Scene Name
32.58, 62.76, 24.27, 222.63, Welcome Center
7.16, 48.40, 15.61, 111.62, Scene Change: Disregard Performance In This Section
21.46, 50.96, 18.14, 100.32, Town Center
8.21, 46.31, 22.31, 76.81, Raffle
9.11, 58.75, 13.56, 142.83, Monument Island
3.54, 62.68, 46.49, 74.02, Benchmark Finished: Disregard Performance In This Section
82.05, 56.34, 13.56, 222.63, Overall

8800GT Half clock (330mhz) with an unlocked frame rate

Per Scene Stats:
Scene Duration (seconds), Average FPS, Min FPS, Max FPS, Scene Name
32.42, 38.44, 17.29, 150.18, Welcome Center
6.82, 28.43, 9.61, 62.61, Scene Change: Disregard Performance In This Section
22.29, 27.56, 15.11, 41.40, Town Center
7.82, 25.08, 19.00, 38.94, Raffle
9.19, 33.21, 14.98, 300.73, Monument Island
3.06, 33.93, 29.93, 38.47, Benchmark Finished: Disregard Performance In This Section
81.60, 32.62, 9.61, 300.73, Overall

8800GT Half clock (330mhz) with a locked frame rate

Per Scene Stats:
Scene Duration (seconds), Average FPS, Min FPS, Max FPS, Scene Name
32.40, 30.00, 15.55, 355.12, Welcome Center
7.31, 25.66, 15.68, 30.78, Scene Change: Disregard Performance In This Section
21.78, 26.99, 14.87, 35.01, Town Center
7.81, 24.32, 19.06, 33.62, Raffle
9.37, 29.99, 21.70, 48.74, Monument Island
3.13, 30.00, 27.86, 32.37, Benchmark Finished: Disregard Performance In This Section
81.81, 28.26, 14.87, 355.12, Overall
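A quick sanity check on how the results scale with clock, using the Overall averages from the unlocked runs above:

```python
# Overall average FPS from the benchmark runs above.
stock_avg_fps = 56.34        # 660 MHz, unlocked frame rate
half_clock_avg_fps = 32.62   # 330 MHz, unlocked frame rate

clock_ratio = 330 / 660
fps_ratio = half_clock_avg_fps / stock_avg_fps

print(f"Core clock cut to {clock_ratio:.0%} of stock")
print(f"Average FPS dropped to {fps_ratio:.0%} of stock")
# FPS falls a bit less than linearly with the core clock, presumably because
# memory clocks and the CPU side of the benchmark weren't cut in half.
```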

So far so good. I didn't play the game for a long time and have no saves, but I managed to find a point of comparison early in the game.
This is the Digital Foundry clip at the same point my video takes place.

My video is not nearly as good looking and was taken from the same machine I was benchmarking on. I thought about upping the quality, but that might have affected the test and would have taken longer to upload.

Screen grab of the downclock [image]

So here is my video
https://www.youtube.com/watch?v=WLk8Tcygtls

You can see early in the video, when I'm looking at the settings, that the Texture Filtering and Dynamic Shadows are set to High under medium settings. This, I think, matches Chris Kline's quote. It's really hard to see the frame rate counter, and I now wish I had made the counter bigger, but I only just thought of it. I have posted an updated video.

Some Notes
While walking around the game I held 28-30 fps pretty consistently, but it would dip to 25-27 fps occasionally.

I monitored my video RAM usage and it would start around 300MB, eventually creep up to 500MB, and stay there. It seems like it just fills up and starts culling after that. Not a big deal, but something I noticed. The Xbox 360 has 32MB of its 512MB devoted to the OS. This leaves 480MB for games. I assume it does the same thing as my PC and stores textures and such until it reaches its limit and then starts culling.


Update:

New higher res version of my video is posted
 