
Digital Foundry - Metro Redux (Console Analysis)

Qassim

Member
It looks like they've done a rather good job with them. Maybe I'm just misremembering, but the overall image doesn't look that sharp. I'm not sure if that's intentional, as it looks a little foggy.

We have seen particle effects in next-gen console games that were previously really only seen to that extent in PhysX, so I was wondering if we'd see some of that stuff in the console versions too (and then as standard in the engine on the PC version, rather than just through PhysX), as the Metro Last Light PhysX effects were pretty good.

As much as I think PCs are superior to consoles for gaming, I don't think that your 780ti will outperform a PS4 in, say, 5 years' time. Think about it like this: it's like saying that a GeForce 7950 GT would outperform the PS3 and 360 for their whole generation. That said, a 780ti will probably outperform next-gen consoles for multi-plat games for 4-5 years.

I think sometimes people mistake the 'better' graphics on consoles as they go through their generation as purely finding ways to do the same thing faster (e.g. Uncharted 1, with all the same graphical features, would run much faster in 5 years with those optimisations than it did at release). But the reality is, most of the optimisations are around better uses of the same resources, e.g. The Last of Us uses a less expensive AA method than Uncharted 1, and GTAV has tighter limits when it comes to leaving bodies around (small example).

I'd say the big gains in learning the system end after 2 years or so; the optimisations from there on are primarily about finding ways to reduce the use of resources on things that aren't as big of a visual pay-off (which unfortunately often means sacrificing good IQ towards the end of the generation).

A top end GPU from around the time of PS3 (e.g. the 8800GT) would run games at console settings or slightly higher all the way through the generation. As mentioned earlier, the difference this gen is that the top end PC GPUs are multiple times more powerful than the consoles, which wasn't the case last time.
 

ss_lemonade

Member
As much as I think PCs are superior to consoles for gaming, I don't think that your 780ti will outperform a PS4 in, say, 5 years' time. Think about it like this: it's like saying that a GeForce 7950 GT would outperform the PS3 and 360 for their whole generation. That said, a 780ti will probably outperform next-gen consoles for multi-plat games for 4-5 years.
If you're using the 780ti for comparison with the PS4, shouldn't you be using something like the 8800 GTX when comparing to the previous-gen consoles? I think that (or any of the 8800 GPUs) easily outperforms both the 360 and PS3.
 

low-G

Member
I benchmarked my OC'ed GTX 570 + 2500K on the Battlefield 4 campaign and it actually slightly outperformed the PS4 on roughly equivalent settings. I'll see if I can dig up the graphs again, but these consoles are quite underpowered.

A 780ti will smash the PS4/X1 throughout its lifetime.

I wouldn't doubt the GTX 570 could outperform the PS4 in BF4; the problem is the PS4 has a more powerful GPU even ignoring code-to-metal optimization! (~1.4 TFLOPS vs ~1.8 in the PS4.) Battlefield 4 was a sloppy port job to consoles. Not saying a 780ti won't outclass a PS4, that is a serious video card. But there's no excuse for BF4's performance on console.
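For anyone wondering where those rough TFLOPS figures come from, here's a quick back-of-envelope sketch using the usual shader count x 2 FLOPs x clock formula (reference specs for each part; these are theoretical peaks only and, as pointed out later in the thread, say nothing about real-world performance across different architectures):

```python
# Theoretical FP32 peak = shaders x 2 FLOPs (FMA) x clock. Paper numbers only.

def peak_tflops(shaders: int, clock_ghz: float) -> float:
    """Peak single-precision throughput in TFLOPS."""
    return shaders * 2 * clock_ghz / 1000.0

gpus = {
    "GTX 570 (480 cores @ 1.464 GHz shader clock)": (480, 1.464),
    "PS4 (1152 shaders @ 0.800 GHz)": (1152, 0.800),
    "Xbox One (768 shaders @ 0.853 GHz)": (768, 0.853),
}

for name, (shaders, clock) in gpus.items():
    print(f"{name}: ~{peak_tflops(shaders, clock):.2f} TFLOPS")
# GTX 570 ~1.41, PS4 ~1.84, Xbox One ~1.31
```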
 

SapientWolf

Trucker Sexologist
I wouldn't doubt the GTX 570 could outperform the PS4 in BF4; the problem is the PS4 has a more powerful GPU even ignoring code-to-metal optimization! (~1.4 TFLOPS vs ~1.8 in the PS4.) Battlefield 4 was a sloppy port job to consoles. Not saying a 780ti won't outclass a PS4, that is a serious video card. But there's no excuse for BF4's performance on console.
The console CPUs are ass. And there's a lot going on in a multiplayer match.
 

RoboPlato

I'd be in the dick
I think sometimes people mistake the 'better' graphics on consoles as they go through their generation as purely finding ways to do the same thing faster (e.g. Uncharted 1, with all the same graphical features, would run much faster in 5 years with those optimisations than it did at release). But the reality is, most of the optimisations are around better uses of the same resources, e.g. The Last of Us uses a less expensive AA method than Uncharted 1, and GTAV has tighter limits when it comes to leaving bodies around (small example).

I'd say the big gains in learning the system end after 2 years or so; the optimisations from there on are primarily about finding ways to reduce the use of resources on things that aren't as big of a visual pay-off (which unfortunately often means sacrificing good IQ towards the end of the generation).
Bolded is false. Drake's Fortune had no AA. TLoU and Uncharted 3 used the same dynamic post process solution, it was just off more often in TLoU to save performance. The only game ND released with a better AA method was Uncharted 2, which used 2x MSAA.
 

Truespeed

Member
Looks like the difference between a 360 and PS3 version of a game, except the frame rates are the same.

That's exactly what I thought, but the lower resolution PS3 port was usually due to developer incompetence, the B-team or budget constraints. This is more of a raw horsepower gap that the Xbox One will never be able to catch up to.
 

NBtoaster

Member
Bolded is false. Drake's Fortune had no AA. TLoU and Uncharted 3 used the same dynamic post process solution, it was just off more often in TLoU to save performance. The only game ND released with a better AA method was Uncharted 2, which used 2x MSAA.

Uncharted 1 used MSAA as well.
 

stryke

Member
To maybe expand on that, and make it a little more precise.

//=====================

Here are the histograms for the 2nd comparison pic in the zoomed comparison.
PS4 top, XB1 bottom.

[Image: u5HwNf2.png]


(The big spike on the left side of the PS4 distribution is also present on the XB1 image, it's just sitting on the left border because I framed these images really poorly.)

The PS4 image contents look very similar to XB1's, except compressed within a range that looks suspiciously like the bounds for the "LDR" content in limited-range RGB. There's a little bit of image content outside of that range; some of this is the "PS4" text on the image, the rest is pretty small and looks as though it's not much more than jpeg-related histogram bleeding.

Images 1 and 4 in the comparison have a similar thing going on.

//=====================

Images 3 and 5 are different. Image 3, the one with the snowy car, has this histogram, which looks more similar between PS4 and XB1:

[Image: CCFzLSO.png]


//=====================

tl;dr DigitalFoundry probably captured some of the PS4 images from an RGB Limited source with an RGB Full receiver.
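If anyone wants to reproduce that range check on their own captures, here's a minimal sketch (assuming Pillow and NumPy are installed; the filenames are placeholders for whichever comparison shots you save locally):

```python
import numpy as np
from PIL import Image

def limited_range_fraction(path: str) -> float:
    """Fraction of RGB values sitting inside the 16-235 'limited range' band."""
    px = np.asarray(Image.open(path).convert("RGB"))
    return ((px >= 16) & (px <= 235)).mean()

# Placeholder filenames - substitute the actual PS4/XB1 captures being compared.
for shot in ("ps4_shot2.png", "xb1_shot2.png"):
    print(shot, f"~{limited_range_fraction(shot):.1%} of values within 16-235")
```

A capture that was really limited-range content stored as full range will come back at close to 100% on one side only, apart from overlaid text and the JPEG bleed mentioned above.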

Why do they feel the need to fiddle with their setup...

New Article

[Image: 6h5ooS4.jpg]


Old article

[Image: JP02rzR.jpg]


[Image: Dc6qa48.jpg]
 

cheezcake

Member
I wouldn't doubt the GTX 570 could outperform the PS4 in BF4; the problem is the PS4 has a more powerful GPU even ignoring code-to-metal optimization! (~1.4 TFLOPS vs ~1.8 in the PS4.) Battlefield 4 was a sloppy port job to consoles. Not saying a 780ti won't outclass a PS4, that is a serious video card. But there's no excuse for BF4's performance on console.

Different architectures, so you can't accurately compare performance based on TFLOPS. The GTX 570 performs just between a 7850 and a 7870 IIRC, which is around where we expected the PS4 to perform based on the specs.
 
Ouch. Crushed blacks are just evil. Hard to understand why anyone would choose a multiplatform game on Xbox if they owned both consoles. There's just too much of a difference that you simply shouldn't be seeing between two consoles released at the same time.


Strange comment considering the PS3 launched a full year later than the 360, yet multiplats looked worse. Either way, this isn't a new development; we've seen a lot worse, specifically between GameCube and Xbox, which both launched at the same time and where some multiplats ran completely different assets (see Splinter Cell). This is actually quite close. This revisionist history is becoming very tiring.
 

Mr Vast

Banned
I benchmarked my OC'ed GTX 570 + 2500K on the Battlefield 4 campaign and it actually slightly outperformed the PS4 on roughly equivalent settings. I'll see if I can dig up the graphs again, but these consoles are quite underpowered.

A 780ti will smash the PS4/X1 throughout its lifetime.

a 760/280x will do that too...
 

Oemenia

Banned
And again, no examples.
I've given You the highest-end title from 2011 that's running at higher specs than the console version.
Where is Your example? A real example.

----

What? Even a C2D + 8800GT combo isn't 3 times more powerful, more like 2.1-2.3x, and this PC is maybe slightly more powerful than the past-gen consoles, like 10-20%.
Hardware that's 4 times more powerful runs past-gen games at 1080p and 60fps on higher settings.
That hardware came out long after the launch of the 360 and cost way more; only 2x is hardly an achievement, and I think that figure is conservative since you're not factoring in the significantly larger amount of RAM.

Also he gave examples, I doubt people are going to be digging up old GPUs and uploading videos of games not launching and/or running poorly.

Are you seriously suggesting that my 780 ti ISN'T going to outperform a PS4 for the rest of the gen?

And please get off your cross. No one here is calling you a console "pleb"; YOU are the only one bringing such terminology into the thread.
I really honestly didn't see somebody name dropping their hardware when talking about consoles, but don't worry, the honour of it is perfectly intact regardless of optimisation on consoles.

That GPU is not even close to 4x as powerful as the 360's GPU nor is that CPU.

I suggest you do not speak on subjects you have no knowledge on.
Nice of you to misquote me to prove my point (I said system, not GPU). I can see why I am the one with no knowledge, especially since you have no idea that an in-order CPU is significantly less powerful than a Conroe. I know John Carmack may be under the bus for his pro-console comments, but he said himself that the PS3/360 CPUs were significantly less powerful than the high-end stuff on the market at the time. The developer of Braid added that a 360 CPU core only compares to a 1.4-1.6GHz P4 chip, and we all know how efficient NetBurst technology was. Also, you're welcome for the reminder that this sort of knee-jerk reaction and dismissal of new console hardware tends to repeat itself.
 

Renekton

Member
I benchmarked my OC'ed GTX 570 + 2500K on the Battlefield 4 campaign and it actually slightly outperformed the PS4 on roughly equivalent settings. I'll see if I can dig up the graphs again, but these consoles are quite underpowered.

A 780ti will smash the PS4/X1 throughout its lifetime.
Stock GTX570 slots right in between 7850 and 7870 according to Toms/Anand, same position as PS4 GPU. So, sounds about right when you OC.
 
Sony did not, AMD did!
The 8 ACEs are part of GCN 1.1 (or whatever it is called) and thus any GCN 1.1 GPU will have the same; it is not unique to the consoles.

We have statements from Mark Cerny and known AMD employees on Beyond3D that confirm the increase in ACEs was a Sony requested enhancement that was later migrated to the PC line of products. Sony's deal was structured such that AMD was free to incorporate advances developed in collaboration into other chips, except for the Xbox One APU.
 

Oemenia

Banned
Qassim said:
I think sometimes people mistake the 'better' graphics on consoles as they go through their generation as purely finding ways to do the same thing faster (e.g. Uncharted 1, with all the same graphical features, would run much faster in 5 years with those optimisations than it did at release). But the reality is, most of the optimisations are around better uses of the same resources, e.g. The Last of Us uses a less expensive AA method than Uncharted 1, and GTAV has tighter limits when it comes to leaving bodies around (small example).

I'd say the big gains in learning the system end after 2 years or so; the optimisations from there on are primarily about finding ways to reduce the use of resources on things that aren't as big of a visual pay-off (which unfortunately often means sacrificing good IQ towards the end of the generation).

A top end GPU from around the time of PS3 (e.g. the 8800GT) would run games at console settings or slightly higher all the way through the generation. As mentioned earlier, the difference this gen is that the top end PC GPUs are multiple times more powerful than the consoles, which wasn't the case last time.
It's one factor, yes, but I think people give too much credit to nips-and-tucks that might be seen as reductions. Around when Gears 3 launched, CliffyB said that Gears 1 levels would run at 500% of the speed on the newest iteration of the engine. Even if you take his words with a pinch of salt, Judgment looked even better and had AA, and that's just one series.
 

HTupolev

Member
Sony's deal was structured such that AMD was free to incorporate advances developed in collaboration into other chips, except for the Xbox One APU.
Dat PPE effect?

It's one factor, yes, but I think people give too much credit to nips-and-tucks that might be seen as reductions. Around when Gears 3 launched, CliffyB said that Gears 1 levels would run at 500% of the speed on the newest iteration of the engine. Even if you take his words with a pinch of salt, Judgment looked even better and had AA, and that's just one series.
I can't find that quote, and having played Gears 1 and a little way through Gears 3, I might have to take that with several universes worth of salt.
 

TheD

The Detective
Nice of you to misquote me to prove my point (I said system, not GPU). I can see why I am the one with no knowledge, especially since you have no idea that an in-order CPU is significantly less powerful than a Conroe. I know John Carmack may be under the bus for his pro-console comments, but he said himself that the PS3/360 CPUs were significantly less powerful than the high-end stuff on the market at the time. The developer of Braid added that a 360 CPU core only compares to a 1.4-1.6GHz P4 chip, and we all know how efficient NetBurst technology was. Also, you're welcome for the reminder that this sort of knee-jerk reaction and dismissal of new console hardware tends to repeat itself.

Grow up. I am well aware of the fact that the IPC of a Xenon core is not as high as a P4's or Core 2's due to a number of reasons (including the fact that it is in-order), but the performance of the 3 cores has been said (by a developer) to be around 1.5x that of a single P4 core running at the same clock speed, and with Core 2 being about 2x the IPC of NetBurst, the CPU in that video is not 4x as fast.
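Quick arithmetic with those same figures, just to show where "not 4x" comes from (the 1.5x and 2x multipliers are the rough claims above, not measurements, and the 3.0 GHz C2D is just an example CPU, so treat the result as equally rough):

```python
# Using the rough multipliers above: Xenon's 3 cores ~1.5x a single P4 at the
# same clock, and Core 2 ~2x NetBurst IPC. Ballpark only, varies by workload.

XENON_CLOCK_GHZ = 3.2
xenon_p4_ghz_equiv = 1.5 * XENON_CLOCK_GHZ        # ~4.8 "P4 GHz-equivalents"

def core2_p4_ghz_equiv(cores: int, clock_ghz: float) -> float:
    """Very rough P4-GHz equivalent for a Core 2 CPU under the assumptions above."""
    return cores * 2.0 * clock_ghz

c2d = core2_p4_ghz_equiv(2, 3.0)                  # a 3.0 GHz Core 2 Duo ~12.0
print(f"C2D 3.0 GHz vs Xenon: ~{c2d / xenon_p4_ghz_equiv:.1f}x")  # ~2.5x, i.e. not 4x
```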

This is also ignoring the fact that it is very possible to run the game with a slower CPU than that (i.e. I was able to play Crysis 2 on a lower-clocked C2D at 2.2 GHz).
Hell, I just checked how it runs on my i5 by disabling 2 of its cores and dropping the clock speed a ton (1.8GHz), and it runs fine (so fine, in fact, that I think it would still run fine on a C2D at 1.8GHz with its lower IPC).

It is extremely clear that you have an agenda.

We have statements from Mark Cerny and known AMD employees on Beyond3D that confirm the increase in ACEs was a Sony requested enhancement that was later migrated to the PC line of products. Sony's deal was structured such that AMD was free to incorporate advances developed in collaboration into other chips, except for the Xbox One APU.

Ahh, I have been giving B3D a miss due to how crap it went when the new consoles got announced.

Love how quickly a console analysis turned into a console vs PC thread.

Well someone asked what PC you would need to run Metro Redux at the same settings as the consoles versions and some people did not like the answers.
 

Renekton

Member
This is also ignoring the fact that it is very possible to run the game with a slower CPU than that (i.e. I was able to play Crysis 2 on a lower-clocked C2D at 2.2 GHz).
Hell, I just checked how it runs on my i5 by disabling 2 of its cores and dropping the clock speed a ton (1.8GHz), and it runs fine (so fine, in fact, that I think it would still run fine on a C2D at 1.8GHz with its lower IPC).

It is extremely clear that you have an agenda.
We all have agendas, that's why we are here, including you and me.

I don't think a 2.2GHz C2D is good enough. I had a Q6600 and it struggled mightily on C2.

http://www.techspot.com/review/379-crysis-2-performance/page8.html
 

TheD

The Detective
We all have agendas, that's why we are here, including you and me.

My agenda was to look at Metro Redux's changes (which some people have been posting in this thread).

I don't think a 2.2GHz C2D is good enough. I had a Q6600 and it struggled mightily on C2.

http://www.techspot.com/review/379-crysis-2-performance/page8.html
Well, Sandy Bridge's IPC is not massively higher than Conroe's, as shown in http://www.tomshardware.com/reviews/processor-architecture-benchmark,2974.html, thus my 1.8GHz Sandy Bridge testing cannot be too far off a 2.2GHz-range C2D.
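To put numbers on that comparison (the ~1.2x IPC ratio below is an assumption picked purely for illustration, not a measured figure, and the premise itself gets disputed further down the thread):

```python
# Assumes Sandy Bridge IPC ~1.2x Conroe (illustrative guess, not a benchmark).
ASSUMED_SNB_OVER_CONROE_IPC = 1.2
snb_clock_ghz = 1.8
print(f"1.8 GHz SNB ~= {snb_clock_ghz * ASSUMED_SNB_OVER_CONROE_IPC:.1f} GHz Conroe-equivalent")
# ~2.2 GHz, which is the comparison being made above.
```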

That test you linked to is also at the max quality settings, which put a much larger drain on the CPU than the lowest settings, which the consoles run at (and do not forget that Crysis 2 is not even close to a locked 30 FPS on consoles).
 

KKRT00

Member
That hardware came out long after the launch of the 360 and cost way more; only 2x is hardly an achievement, and I think that figure is conservative since you're not factoring in the significantly larger amount of RAM.

Also he gave examples, I doubt people are going to be digging up old GPUs and uploading videos of games not launching and/or running poorly.

2x is with the C2D + 8800 GT combo; for the C2D + X1950 it's 1.1x, maybe 1.2x.

No, he didn't give any examples. He didn't post any video or benchmark confirming his theory (as always in this thread), while I posted freaking Crysis 2 running on shit hardware from 2005.
There are tons of examples of the 8800 GT running games 2x faster than the past-gen consoles, and if the whole optimization difference were true, it should run only slightly better or the same.

----
I don't think 2.2Ghz C2D is good enough. I had a Q6600 and it struggled mightily on C2.
What?! That benchmark is at Extreme settings, while the console version runs at settings below the lowest PC preset.
I've played through the whole game on a 3GHz C2D + R4850, and I was actually frapsing it as well.
In multiplayer I had 50+ fps with Fraps running, on High/Gamer settings:
https://www.youtube.com/watch?v=6AuFw9OeYBw&list=UULvq9KX4OlKxx5vA-DX3UqQ

SP on custom settings, mostly Advanced and some higher.
https://www.youtube.com/watch?v=H4501LB8Ysc&list=UULvq9KX4OlKxx5vA-DX3UqQ
 

gofreak

GAF's Bob Woodward
2x is with the C2D + 8800 GT combo; for the C2D + X1950 it's 1.1x, maybe 1.2x.

No, he didn't give any examples. He didn't post any video or benchmark confirming his theory (as always in this thread),

The info's out there, but if you insist.

X1950 Pro with early gen vs late gen

Oblivion: 1600x1200/16xAF/Ultra High/43.8fps avg
Skyrim: 800x600/Low/20-29fps indoors (Later somewhat higher with a CPU upgrade, but not nearly Oblivion style perf)

Assassin's Creed: Max Settings/30fps avg
Assassin's Creed Black Flag: Does not meet minimum requirements

Battlefield 2: 1600x1200/High/50fps
Battlefield: Bad Company 2: 1280x1024/23-28fps ('with shooting')
Battlefield 4: Does not meet minimum requirements?

Call of Duty 2: 1600x1200/40.4fps
Call of Duty Ghosts: Does not meet minimum requirements

Are you contending that the x1950 pro maintained the same performance superiority vs the consoles across the whole gen as it had at the start? Because all I'm contending is that spec inflation was a thing and that a static relationship vs console performance couldn't be banked on. And that may or may not happen this time, I don't know, but that history might be something to bear in mind when looking at the DF article linked to earlier about current GPU perf vs the new consoles in launch games. Even the example of Crysis illustrates this - that GPU may have held up against the 360 in Crysis 2, but was completely deprecated in terms of support for Crysis 3.

This is kind of OT right now though. I'd be happy to talk about this over PM if you think this trend wasn't the case and have a wider sample of games to illustrate that.
 
The info's out there, but if you insist.

X1950 Pro with early gen vs late gen

Oblivion: 1600x1200/16xAF/Ultra High/43.8fps avg
Skyrim: 800x600/Low/20-29fps indoors (Later somewhat higher with a CPU upgrade, but not nearly Oblivion style perf)

Assassin's Creed: Max Settings/30fps avg
Assassin's Creed Black Flag: Does not meet minimum requirements

Battlefield 2: 1600x1200/High/50fps
Battlefield: Bad Company 2: 1280x1024/23-28fps ('with shooting')
Battlefield 4: Does not meet minimum requirements?

Call of Duty 2: 1600x1200/40.4fps
Call of Duty Ghosts: Does not meet minimum requirements

Are you contending that the x1950 pro maintained the same performance superiority vs the consoles across the whole gen as it had at the start? Because all I'm contending is that spec inflation was a thing and that a static relationship vs console performance couldn't be banked on. And that may or may not happen this time, I don't know, but that history might be something to bear in mind when looking at the DF article linked to earlier about current GPU perf vs the new consoles in launch games. Even the example of Crysis illustrates this - that GPU may have held up against the 360 in Crysis 2, but was completely deprecated in terms of support for Crysis 3.

This is kind of OT right now though. I'd be happy to talk about this over PM if you think this trend wasn't the case and have a wider sample of games to illustrate that.

Well, the reason it did not meet minimum requirements for a number of games late-gen is because the PC versions, somewhat arbitrarily in some cases, went completely DX10 or DX11 only. The PC versions could not even be forced to run at near-console settings; the X1950 XT was a DX9.0c card. Similarly, a number of games, like the Call of Duty ones, purposely locked out cards that could run the game because they were not added to a master list which the game referenced against.
 

KKRT00

Member
The info's out there, but if you insist.

X1950 Pro with early gen vs late gen

Oblivion: 1600x1200/16xAF/Ultra High/43.8fps avg
Skyrim: 800x600/Low/20-29fps indoors (Later somewhat higher with a CPU upgrade, but not nearly Oblivion style perf)

Assassin's Creed: Max Settings/30fps avg
Assassin's Creed Black Flag: Does not meet minimum requirements

Battlefield 2: 1600x1200/High/50fps
Battlefield: Bad Company 2: 1280x1024/23-28fps ('with shooting')
Battlefield 4: Does not meet minimum requirements?

Call of Duty 2: 1600x1200/40.4fps
Call of Duty Ghosts: Does not meet minimum requirements

Are you contending that the x1950 pro maintained the same performance superiority vs the consoles across the whole gen as it had at the start? Because all I'm contending is that spec inflation was a thing and that a static relationship vs console performance couldn't be banked on. And that may or may not happen this time, I don't know, but that history might be something to bear in mind when looking at the DF article linked to earlier about current GPU perf vs the new consoles in launch games. Even the example of Crysis illustrates this - that GPU may have held up against the 360 in Crysis 2, but was completely deprecated in terms of support for Crysis 3.

This is kind of OT right now though. I'd be happy to talk about this over PM if you think this trend wasn't the case and have a wider sample of games to illustrate that.
Are You kidding me? You really do not understand what You did wrong here?
 

Durante

Member
Are you contending that the x1950 pro maintained the same performance superiority vs the consoles across the whole gen as it had at the start?
You do realize that console-equivalent settings on PC around the start of the gen were close to "high", and dropped to below "low" in some titles towards the end of the generation?
 

TheD

The Detective
The info's out there, but if you insist.

X1950 Pro with early gen vs late gen

Oblivion: 1600x1200/16xAF/Ultra High/43.8fps avg
Skyrim: 800x600/Low/20-29fps indoors (Later somewhat higher with a CPU upgrade, but not nearly Oblivion style perf)

Assassin's Creed: Max Settings/30fps avg
Assassin's Creed Black Flag: Does not meet minimum requirements

Battlefield 2: 1600x1200/High/50fps
Battlefield: Bad Company 2: 1280x1024/23-28fps ('with shooting')
Battlefield 4: Does not meet minimum requirements?

Call of Duty 2: 1600x1200/40.4fps
Call of Duty Ghosts: Does not meet minimum requirements

Are you contending that the x1950 pro maintained the same performance superiority vs the consoles across the whole gen as it had at the start? Because all I'm contending is that spec inflation was a thing and that a static relationship vs console performance couldn't be banked on. And that may or may not happen this time, I don't know, but that history might be something to bear in mind when looking at the DF article linked to earlier about current GPU perf vs the new consoles in launch games. Even the example of Crysis illustrates this - that GPU may have held up against the 360 in Crysis 2, but was completely deprecated in terms of support for Crysis 3.

This is kind of OT right now though. I'd be happy to talk about this over PM if you think this trend wasn't the case and have a wider sample of games to illustrate that.

History is irrelevant; with the very clear slowing of Moore's Law you will not see close to as large a performance increase over the same time span as last gen, so lazy programming (Skyrim) and massive tech advances in games (which cut off the low end) are not going to be as acceptable as they were.
So today's GPUs should last much longer (Including what you will need for console level Metro Redux performance).
 

gofreak

GAF's Bob Woodward
Well, the reason it did not meet minimum requirements for a number of games late-gen is because the PC versions, somewhat arbitrarily in some cases, went completely DX10 or DX11 only.

Right, I'm not saying that if devs had honed in on the X1950 Pro they wouldn't or couldn't have continued to support it at console+ levels of perf, but the environment at the time meant devs were happy to simply ride spec inflation rather than decently support older GPUs. The reasoning is immaterial to my point though - it happened. And even aside from abrupt support drops for DX9 hardware, there are examples above that illustrate degradation in performance vs a console as the years advanced. Where it enjoyed brilliant performance vs the 360 in Oblivion, it struggles with Skyrim, for example.


You do realize that console-equivalent settings on PC around the start of the gen were close to "high", and dropped to below "low" in some titles towards the end of the generation?

Absolutely. But I'm not sure what that has to do with my point. My point is that that card did not hold up over the course of the gen as well against the consoles as it did at the start. PC settings rose and rose beyond what consoles were offering, for sure, but you needed hardware to match. The X1950 wasn't keeping up with that rise; it went from comfortable outperformance in the examples above to lesser outperformance, or underperformance, or non-support entirely. Skyrim there is running at half the res of the 360 version on low (console-equivalent) settings, while its earlier-gen incarnation was pelting away at console-equivalent settings at much higher res.

I didn't think 'spec inflation' in PC games last gen was such a controversial subject. I thought it was widely accepted to be 'a thing', but that the debate around this was whether it would carry forward or not this gen.
 

KKRT00

Member
Skyrim there is running at half the res of the 360 version on low (console-equivalent) settings, while its earlier-gen incarnation was pelting away at console-equivalent settings at much higher res.
It's because of the CPU, not the GPU.
Skyrim is a very CPU-dependent game.

If You really want to find examples of games released after 2010, You need to find a modern GPU in a similar performance bracket to the X1950.

And there is no 'spec inflation'. The 8800 GT/GTS tests, which are countless on YouTube (and there are even some GeForce benchmarks for ME3 and C2), are the best examples of that. That card outperformed the consoles by a factor of two in 2006, and in 2011/2012 it still outperformed the consoles by a factor of two.
 
Right, I'm not saying that if devs had honed in on the X1950 Pro they wouldn't or couldn't have continued to support it at console+ levels of perf, but the environment at the time meant devs were happy to simply ride spec inflation rather than decently support older GPUs. The reasoning is immaterial to my point though - it happened. And even aside from abrupt support drops for DX9 hardware, there are examples above that illustrate degradation in performance vs a console as the years advanced. Where it enjoyed brilliant performance vs the 360 in Oblivion, it struggles with Skyrim, for example.

That has nothing to do with the GPU though; it has more to do with the horribly, horribly older-than-dirt CPU...
 
  • PS4 : 1920x1080 = 2073600 pixels
  • XB1 : 1620x912 = 1477440 pixels

That's more than a slight difference.

It's a huge difference numerically, but it is a slight difference VISUALLY. To some people anyway, to others it's a huge difference visually, although I tend to doubt their claims of jaw-dropping differences between 900p upscaled to 1080p and 1080p native. It's noticeable for sure, and side-by-side I can almost instantly point out a PS4 multiplat when next to an XB1 multiplat. In fact it's usually easier to just look for the darker image and assume it's the XB1 version than to look for the crisper image, because the darkness of XB1 blacks just jumps out at ya more. So crushed. But slight is a subjective term as many have said, and to dogpile on someone (OP or other) for calling it a slight difference is absurd.
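For reference, the raw arithmetic behind the numeric gap being argued about:

```python
# Pixel counts from the quote above, plus the relative gap between them.
ps4_pixels = 1920 * 1080      # 2,073,600
xb1_pixels = 1620 * 912       # 1,477,440
print(f"PS4 pushes ~{ps4_pixels / xb1_pixels:.2f}x the pixels of XB1 (~40% more)")
```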
 
It's because of the CPU, not the GPU.
Skyrim is a very CPU-dependent game.

If You really want to find examples of games released after 2010, You need to find a modern GPU in a similar performance bracket to the X1950.

And there is no 'spec inflation'. The 8800 GT/GTS tests, which are countless on YouTube (and there are even some GeForce benchmarks for ME3 and C2), are the best examples of that. That card outperformed the consoles by a factor of two in 2006, and in 2011/2012 it still outperformed the consoles by a factor of two.

I don't think you can compare the 8800 cards because they are literally a generation ahead of the GPUs in the older consoles, especially the 8800GT, which was quite a bit better than the GTS cards.
 

Oemenia

Banned
You do realize that console-equivalent settings on PC around the start of the gen were close to "high", and dropped to below "low" in some titles towards the end of the generation?
The definitions were very different then, though. Since consoles became lead platforms, Low simply means the baseline (mostly slightly worse than consoles).

Also, I'm on mobile so I can't quote the poster, but let's not forget that the 8800 GT came out mid-2007 at £200; calling it 2x more powerful is being extremely conservative when in pure numbers it runs circles around Xenos.
 
The definitions were very different then, though. Since consoles became lead platforms, Low simply means the baseline (mostly slightly worse than consoles).

Also, I'm on mobile so I can't quote the poster, but let's not forget that the 8800 GT came out mid-2007 at £200; calling it 2x more powerful is being extremely conservative when in pure numbers it runs circles around Xenos.

Actually, in games like Crysis 2 and 3, low still means much, much higher than the last-gen consoles. You can test it quite easily by loading up the "Xbox 360 config."

You cannot just post baseless conjecture concerning things... please post the settings of the games you mean...
 

gofreak

GAF's Bob Woodward
It's because of the CPU, not the GPU.
Skyrim is a very CPU-dependent game.

If You really want to find examples of games released after 2010, You need to find a modern GPU in a similar performance bracket to the X1950.

And there is no 'spec inflation'. The 8800 GT/GTS tests

The 8800 GT and GTS are a year newer than the X1950 (two years newer than Xenos), and to one extent or another more powerful: 400-600 GFLOPS GPUs with better bandwidth etc. They're not X1950 Pros, your initial card of choice in this.

I'm not arguing that more powerful newer cards wouldn't ride out the gen better. Or, indeed, that adding better CPUs or more memory won't extend the life of a GPU and maintain relative perf better. Perhaps beyond a certain level of perf increase over the consoles, the relationship vs the consoles over the gen was more static? Maybe. But these are different parameters.

I'm not even arguing that this will happen this gen.
 

KKRT00

Member
I don't think you can compare the 8800 cards because they are literally a generation ahead of the GPUs in the older consoles, especially the 8800GT, which was quite a bit better than the GTS cards.

Up to 10% better. There are examples for both, but the GT is more common on YouTube and in benchmarks.
The architecture was released in 2006 and we are talking about games released 6 years later, so why is it not viable? Because it had a unified architecture and streaming processors?
---
Also, I'm on mobile so I can't quote the poster, but let's not forget that the 8800 GT came out mid-2007 at £200; calling it 2x more powerful is being extremely conservative when in pure numbers it runs circles around Xenos.
The 8800 GT is 10% faster than the GTS, which came out in 2006, so I don't know what Your point is.

If You can provide examples of an older GPU on YouTube or in benchmarks then I'll be all for it, but because You can't, You have to use similar performance brackets.
 

Durante

Member
The point in this thread was originally about GPUs. It has generally been the case that the performance advantage of equivalent hardware in consoles on the CPU side was much higher than on the GPU side - at least with DirectX 9. And it has to be, since console CPUs suck so much.

Absolutely. But I'm not sure what that has to do with my point.
The way you present it could make it seem like there was a decline when it was in fact just the definition of words changing.

Oh, and your Skyrim example seems to be one of the versions before Bethesda could be arsed to add -O3 to their compilation flags. People like to complain about Ubisoft, but that right there is still one of the most damning examples of a developer not giving a shit.
 
Up to 10% better. There are examples for both, but the GT is more common on YouTube and in benchmarks.
The architecture was released in 2006 and we are talking about games released 6 years later, so why is it not viable? Because it had a unified architecture and streaming processors?

An attribute which makes it more directly comparable to the Xenos GPU.

If its performance relative to "console" settings remained static, then it proves the point fully.

The point in this thread was originally about GPUs. It has generally been the case that the performance advantage of equivalent hardware in consoles on the CPU side was much higher than on the GPU side - at least with DirectX 9. And it has to be, since console CPUs suck so much.

The way you present it could make it seem like there was a decline when it was in fact just the definition of words changing.

Oh, and your Skyrim example seems to be one of the versions before Bethesda could be arsed to add -O3 to their compilation flags.

His Skyrim example is also using a low-clocked Pentium 4 for the CPU, a CPU which came out 2 years before the Xbox 360.

EDIT: I have access to an 8800 GT, should I downclock it to lower, "console"-like clocks, like 500 MHz, and run some tests? LOL
 

Renekton

Member
Well, Sandy Bridge's IPC is not massively higher than Conroe's, as shown in http://www.tomshardware.com/reviews/processor-architecture-benchmark,2974.html, thus my 1.8GHz Sandy Bridge testing cannot be too far off a 2.2GHz-range C2D.

That test you linked to is also at the max quality settings, which put a much larger drain on the CPU than the lowest settings, which the consoles run at (and do not forget that Crysis 2 is not even close to a locked 30 FPS on consoles).
Hmm, I believe even Conroe/Kentsfield to Nehalem alone was a big difference in terms of game performance per clock. I dumped the Q6600 something fierce (due to inferior stepping) for an i7-920, and even at stock the difference is big, especially in Blizzard games and Civ5.
 
Hmm, I believe even Conroe/Kentsfield to Nehalem alone was a big difference in terms of game performance per clock. I dumped the Q6600 something fierce (due to inferior stepping) for an i7-920, and even at stock the difference is big, especially in Blizzard games and Civ5.

I can personally attest to the massive difference in clock-for-clock performance between Conroe and Nehalem. Went from a Q6600 @ 3.2 to a Core i7 930 @ 2.97 and I was no longer CPU bottlenecked at all. This is without factoring in HT (the games which bottlenecked me were BFBC2 and DOWII at the time).
 

gofreak

GAF's Bob Woodward
The way you present it could make it seem like there was a decline when it was in fact just the definition of words changing.

Do you think that an X1950 Pro with a better CPU would be able to play Skyrim at the same 'console equivalent' settings, same res, and same fps as it does Oblivion? If it doesn't, I'm not sure what to call that but a decline in how it holds up vs the consoles. I can't find another video of Skyrim on an X1950, but if someone can, I'm all ears.

Maybe someone should build that PC and test it out. Or for this gen, preserve a 'console equivalent' 2013 PC and see how it runs things in 5 years compared to that DF article now.
 

dark10x

Digital Foundry pixel pusher
e.g. The Last of Us uses a less expensive AA method than Uncharted 1, and GTAV has tighter limits when it comes to leaving bodies around (small example).
Uncharted 2, however, used superior AA, ran at a much smoother frame-rate, featured object motion blur, and was overall much more detailed.

It wasn't until UC3 that they moved to post-process AA.
 

KKRT00

Member
Do you think that an X1950 Pro with a better CPU would be able to play Skyrim at the same 'console equivalent' settings, same res, and same fps as it does Oblivion? If it doesn't, I'm not sure what to call that but a decline in how it holds up vs the consoles. I can't find another video of Skyrim on an X1950, but if someone can, I'm all ears.
First of all, Oblivion was unoptimized on consoles, and yes, Skyrim should run slightly better than on the Xbox 360 with a better CPU.

Maybe someone should build that PC and test it out. Or for this gen, preserve a 'console equivalent' 2013 PC and see how it runs things in 5 years compared to that DF article now.

Go for it, we'll wait. Second, DF is already covering it with their R 270 PC.
 