Beerman462
Member
Updated OP
In light of the ongoing "The GeForce 750 Ti shouldn't be capable of matching or beating the PS4 GPU" thread, I decided to perform an experiment. I still have a working 8800GT, and my HTPC uses a Sandy Bridge G530 dual core, which according to benchmarks puts it in about the same category as the Core 2 Duos.
I know the 8800GT is a higher-end part compared to the PS3 and Xbox 360 GPUs, but it's all I have, and I still think it's close enough to extrapolate some meaningful data in this context.
There are more than a few people on this board who believe in the promise of the coding-to-the-metal philosophy and the oft-quoted John Carmack.
So my goal is to prove or disprove these beliefs with cold, hard facts.
My test setup is as follows:
Intel Celeron G530 @ 2.4 GHz
ASUS P8H61-M
4 GB G.SKILL DDR3 @ 1333 MHz
MSI NX8800GT OC
My first game for now is Tomb Raider. This is all at 720p, which is the same resolution the PS3 and Xbox 360 run for this game.
Generic settings
Normal Settings
High Settings
Custom/Ultra
I am blown away by these tests and can't believe how well the 8800GT performs in such a modern game.
Update:
I removed my Steam list because I will be concentrating on BioShock Infinite, due to its excellent in-engine benchmark. It will let me use custom resolutions to account for the custom resolutions on the PS3 and Xbox 360, and I will downclock my 8800GT to mimic the console hardware as closely as I can. I don't think this is necessary, but for the sake of science I will do it. I don't know how it's going to go, because I can't fully account for things like the 360's eDRAM. It's easier to match GPUs with the Xbox, though, because the PS3 has to utilize the Cell for some graphics tasks, and I can't account for that at all.
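To pick a downclock target, one rough approach is to scale the 8800GT's clocks so its theoretical shader throughput lands near the Xbox 360's Xenos GPU. This is only a sketch: the spec numbers below are commonly cited theoretical peaks (112 shaders at a 1500 MHz reference shader clock for the 8800GT, ~240 GFLOPS for Xenos), not measured figures, and peak FLOPS ignores architectural differences like the eDRAM entirely.

```python
# Sketch: estimate a shader-clock target for an 8800GT so its theoretical
# peak throughput roughly matches the Xbox 360's Xenos GPU.
# All spec numbers are commonly quoted theoretical peaks, not measurements.

def gflops_g92(shader_count, shader_clock_mhz, flops_per_clock=3):
    """Theoretical peak GFLOPS, counting 3 flops per ALU per clock."""
    return shader_count * shader_clock_mhz * flops_per_clock / 1000.0

GT8800_SHADERS = 112
GT8800_SHADER_MHZ = 1500   # reference clock; the MSI OC card ships higher
XENOS_GFLOPS = 240.0       # widely quoted theoretical peak for Xenos

card_gflops = gflops_g92(GT8800_SHADERS, GT8800_SHADER_MHZ)
ratio = XENOS_GFLOPS / card_gflops
target_shader_mhz = GT8800_SHADER_MHZ * ratio

print(f"8800GT theoretical peak: {card_gflops:.0f} GFLOPS")
print(f"clock ratio to match Xenos: {ratio:.2f}")
print(f"target shader clock: {target_shader_mhz:.0f} MHz")
```

By this crude measure the card would need to run at well under half its reference shader clock, which is why simply benchmarking at stock clocks flatters the PC side.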
To try and clear things up, I'm quoting myself:
I see many are still missing the point. This is not a like-for-like comparison of hardware. It's about people's belief that somehow, magically, current PC hardware will be suffering at the end of the current console generation while the consoles will still be going strong. That isn't the case. My 8800GT was faster than the old consoles then, and it still is now. Likewise, today's cards will still be outperforming the PS4 and XB1 seven years from now.
This is not meant to dispel Carmack's quote as false. It is aimed at the myth of PC hardware becoming obsolete in comparison to console hardware.