
Digital Foundry - Metro Redux (Console Analysis)

TheD

The Detective
I can personally attest to the massive difference in clock-for-clock performance between Conroe and Nehalem. I went from a Q6600 @ 3.2GHz to a Core i7 930 @ 2.97GHz and was no longer CPU bottlenecked at all. This is without factoring in HT (the games which bottlenecked me were BFBC2 and DoW II at the time).

http://www.anandtech.com/bench/product/49?vs=46
Not a massive difference at all (note that the Q9400 is running 0.2GHz lower than your Q6600, but is close IPC-wise, as shown in http://www.anandtech.com/bench/product/75?vs=53)
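To put rough numbers on that clock-versus-IPC trade-off, single-thread throughput scales roughly with IPC x clock; the sketch below uses a placeholder IPC ratio purely for illustration (it is not a benchmarked figure):

```python
# Rough sketch: relative single-thread throughput ~ IPC x clock.
# The 1.25x Nehalem-vs-Conroe IPC figure is an assumed placeholder,
# not a benchmarked number.
conroe_clock_ghz = 3.2     # Q6600 overclocked
nehalem_clock_ghz = 2.97   # Core i7 930 overclocked
assumed_ipc_ratio = 1.25   # hypothetical Nehalem IPC advantage

conroe_score = 1.0 * conroe_clock_ghz
nehalem_score = assumed_ipc_ratio * nehalem_clock_ghz

print(f"Conroe @ 3.2GHz:   {conroe_score:.2f}")
print(f"Nehalem @ 2.97GHz: {nehalem_score:.2f} "
      f"({nehalem_score / conroe_score:.0%} of the Conroe score)")
```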
 

Darkroronoa

Member
Aren't the X1950's benchmarks beside the point, since it doesn't have a unified shader architecture? The Xbox was advanced when it came out.

Now modern GPUs have the same architecture as the consoles, so these GPUs should last the generation.
I mean, an 8600 GT or an ATI 3650 can play most modern games that don't need DX11 (since they don't support it) with performance pretty close to a console, and they're supposed to have similar performance to the consoles (judging by flops and bandwidth; I know that's not everything, but I had a 3650 and could play multiplatform games at similar settings).
 

gofreak

GAF's Bob Woodward
First of all, Oblivion was unoptimized on console, and yes, Skyrim should run slightly better than on the Xbox 360 with a better CPU.

If you're contending there's a static relationship in its performance vs the consoles, it should handily outperform them as it did in Oblivion, not just slightly. So that's what I'm wondering: where it played Oblivion at 1600x1200/40+fps/Ultra High, will it play Skyrim at 1600x1200/40+fps/Low?

Every early console game is arguably not very well optimised for the hardware compared to later waves. That's part of the point, really.

Go for it, we'll wait. Second, DF is already covering this with their R 270 PC.

I'm afraid I don't have the time or the money :) Re. DF though, what I mean is that it would be very interesting for them to come back to this PC in 5 years and do an end-of-gen article to compare against their beginning-of-gen one.
 
Absolutely. But I'm not sure what that has to do with my point. My point is that that card did not hold up against the consoles over the course of the gen as well as it did at the start. PC settings rose and rose beyond what consoles were offering, for sure, but you needed the hardware to match.

Of course you did, because PC hardware evolves constantly. What you don't understand is that this is not an issue at all for gamers, since by the time this increase in visual fidelity happens you can buy a new graphics card that will get the job done for the price of a video game, at most a collector's edition one. There is no reason to stay with an ancient graphics card when more power is available so cheaply. Developers know that and they design their games accordingly. If PC game settings had remained the same throughout the generation I believe that the X1950 would be able to keep up just fine.
 

coastel

Member
Aren't the X1950's benchmarks beside the point, since it doesn't have a unified shader architecture? The Xbox was advanced when it came out.

Now modern GPUs have the same architecture as the consoles, so these GPUs should last the generation.
I mean, an 8600 GT or an ATI 3650 can play most modern games that don't need DX11 (since they don't support it) with performance pretty close to a console, and they're supposed to have similar performance to the consoles (judging by flops and bandwidth; I know that's not everything, but I had a 3650 and could play multiplatform games at similar settings).

I'm no tech expert, just curious, but isn't it similar with the PS4 being different in that it can use its unified pool of RAM for CPU tasks, which no PCs do, so maybe it won't be a good comparison again? Please correct me if I'm wrong; I'd like to understand the differences. I don't believe this "two times more power from optimisation" stuff, but surely there is some benefit to closed platforms.
 

EGM1966

Member
It's a huge difference numerically, but it is a slight difference VISUALLY. To some people anyway, to others it's a huge difference visually, although I tend to doubt their claims of jaw-dropping differences between 900p upscaled to 1080p and 1080p native. It's noticeable for sure, and side-by-side I can almost instantly point out a PS4 multiplat when next to an XB1 multiplat. In fact it's usually easier to just look for the darker image and assume it's the XB1 version than to look for the crisper image, because the darkness of XB1 blacks just jumps out at ya more. So crushed. But slight is a subjective term as many have said, and to dogpile on someone (OP or other) for calling it a slight difference is absurd.
Personally, I find it absurd to use "slight" to describe differences of more than 25% in a technical thread discussing comparisons of empirical data.

In OT, when discussing your own view on how much (or how little) it matters to you? Sure, but there are other threads for that. As others have noted, people in tech threads need to stick to the facts or provide more rationale for their subjective views.

Objectively there is a moderate to notable resolution difference between the two. Not slight.

It's just not a subjective kind of thread at the end of the day.
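For concreteness, the resolution gap is easy to quantify; a quick sketch of the framebuffer arithmetic (plain pixel counting, nothing console-specific assumed):

```python
# Framebuffer pixel counts for the resolutions under discussion.
px_1080p = 1920 * 1080   # 2,073,600 pixels
px_900p  = 1600 * 900    # 1,440,000 pixels

print(f"1080p renders {px_1080p / px_900p:.2f}x the pixels of 900p")        # 1.44x
print(f"900p pushes {1 - px_900p / px_1080p:.1%} fewer pixels than 1080p")  # 30.6%
```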
 

TheD

The Detective
I'm no tech expert, just curious, but isn't it similar with the PS4 being different in that it can use its unified pool of RAM for CPU tasks, which no PCs do, so maybe it won't be a good comparison again? Please correct me if I'm wrong; I'd like to understand the differences. I don't believe this "two times more power from optimisation" stuff, but surely there is some benefit to closed platforms.

RAM is Random Access Memory; it does not compute anything.
 

gofreak

GAF's Bob Woodward
Of course you did, because PC hardware evolves constantly. What you don't understand is that this is not an issue at all for gamers, since by the time this increase in visual fidelity happens you can buy a new graphics card that will get the job done for the price of a video game, at most a collector's edition one.

I fully understand the ability and willingness of people to upgrade, and how this feeds into developer decisions about specification requirements and recommendations. I'm saying X happens; you're saying it happens because of XYZ... and I don't disagree. There are lots of reasons why it happened, why one game might have been poorly optimised, why DX9 support was dropped, why a dev bumped min specs between franchise iterations for convenience, etc. I'm just saying it happened, whatever the reasoning, where others earlier seemed to contend that it did not and that the perf delta between a PC and console was and would be reliably static across a generation. From the subsequent debate, what I've drawn is that it's a case of, at least, "it depends".
 

Raist

Banned
To maybe expand on that, and make it a little more precise.

//=====================

Here are the histograms for the 2nd comparison pic in the zoomed comparison.
PS4 top, XB1 bottom.

[Histogram image: u5HwNf2.png]


(The big spike on the left side of the PS4 distribution is also present on the XB1 image, it's just sitting on the left border because I framed these images really poorly.)

The PS4 image contents look very similar to XB1's, except compressed within a range that looks suspiciously like the bounds for the "LDR" content in limited-range RGB. There's a little bit of image content outside of that range; some of this is the "PS4" text on the image, the rest is pretty small and looks as though it's not much more than jpeg-related histogram bleeding.

Images 1 and 4 in the comparison have a similar thing going on.

//=====================

Images 3 and 5 are different. Image 3, the one with the snowy car, has this histogram, which looks more similar between PS4 and XB1:

[Histogram image: CCFzLSO.png]


//=====================

tl;dr DigitalFoundry probably captured some of the PS4 images from an RGB Limited source with an RGB Full receiver.
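For anyone who wants to reproduce this kind of check, here is a minimal sketch of a range test along the lines described above; the filenames are placeholders and it assumes the Pillow and NumPy libraries are available:

```python
# Minimal sketch of the range check described above: see whether a
# capture's RGB values sit almost entirely inside the 16-235 "limited
# range" band, which would point to a limited-range source grabbed as
# full range. Filenames are placeholders; requires Pillow and NumPy.
import numpy as np
from PIL import Image

def range_report(path):
    img = np.asarray(Image.open(path).convert("RGB"))
    outside = np.mean((img < 16) | (img > 235))  # fraction of values outside 16-235
    print(f"{path}: min={img.min()} max={img.max()} outside 16-235: {outside:.3%}")

range_report("ps4_capture.png")  # limited-range capture: min~16, max~235, ~0% outside
range_report("xb1_capture.png")  # full-range capture: values spread across 0-255
```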

Wouldn't be the first time DF is utterly confused when it comes to limited/full settings. Their original defense for the XB1 crushed blacks issue made no sense whatsoever.
 
Do you think that an X1950 Pro with a better CPU would be able to play Skyrim at the same "console equivalent" settings, same res, and same fps as it does Oblivion? If it doesn't, I'm not sure what to call that but a decline in how it holds up vs the consoles. I can't find another video of Skyrim on an X1950, but if someone can, I'm all ears.

Maybe someone should build that PC and test it out. Or for this gen, preserve a 'console equivalent' 2013 PC and see how it runs things in 5 years compared to that DF article now.




There's something you seem to never take into account.
And it's a pretty important one... the Xbox 360's GPU was actually NEWER than PC GPUs at the time.
As for the CPU, you're comparing it to a Pentium 4. Sure, the Pentium 4 was faster core for core... but the Pentium 4 was only ONE core.
The PS3, meanwhile, was maybe not using a newer GPU, but it was still using a high-end GPU for the time. Plus take the CPU into account.
Both were advanced machines for their time.
They had an advantage over PCs, in terms of cores or architecture.


But nowadays?
Not only are those consoles using off-the-shelf parts, those components aren't even high end. There was no way to find something 3 to 4 times more powerful than an Xbox 360 at its release. The same could be said of the PS3. But as for the PS4? You already have GPUs 3 times faster. And not $1000 ones, but $250 ones.
It is already easy to find cheap GPUs that outperform, in terms of raw specs, the ones you find in consoles. Which wasn't the case back in 2005/2006.
 

gofreak

GAF's Bob Woodward
There's something you seem to never take into account.

Throughout this discussion I said that I had no idea if this would or wouldn't repeat into the future. You're presenting reasons why this may not repeat, and I agree. Things could be different this gen for a variety of reasons. I was merely saying it did happen in response to a poster who said that 'console optimisation never happens in reality' and pointed to an article about launch games this gen on PC vs console. I was contending that if (relative) console optimisation 'never happened', the relationship between a PC of similar hw vs consoles should be static across a gen, and that didn't seem to me to be the case last gen, so be cautious about drawing too many conclusions from launch games. Maybe it would be different this gen, maybe not. That's all!
 
lol, considering the 780Ti isn't even 3 times as fast and costs $700...

Do not just look at the TFs of the 780 Ti; NVIDIA flops generally deliver more performance per flop than AMD flops.

1.84 AMD TF (PS4 GPU) x 3 = 5.52 TF

A 290X is 5.6 TF.

A 290X performs quite similarly to a 780 Ti in most applications, and the 780 Ti is 5.04 NVIDIA TF.
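For reference, those TF figures fall out of the usual peak-throughput formula, shader count x clock x 2 ops per cycle (FMA); a quick sketch using the commonly cited shader counts and base/reference clocks:

```python
# Peak single-precision throughput = shaders x clock x 2 ops/cycle (FMA).
# Shader counts and base/reference clocks as commonly cited at the time.
def tflops(shaders, clock_ghz):
    return shaders * clock_ghz * 2 / 1000

print(f"PS4 GPU:    {tflops(1152, 0.800):.2f} TF")  # ~1.84
print(f"R9 290X:    {tflops(2816, 1.000):.2f} TF")  # ~5.63
print(f"GTX 780 Ti: {tflops(2880, 0.875):.2f} TF")  # ~5.04 at base clock
```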
 

low-G

Member
Do not just look at the TFs of the 780 Ti; NVIDIA flops generally deliver more performance per flop than AMD flops.

1.84 AMD TF (PS4 GPU) x 3 = 5.52 TF

A 290X is 5.6 TF.

A 290X performs quite similarly to a 780 Ti in most applications, and the 780 Ti is 5.04 NVIDIA TF.

Except consoles don't rely on drivers. Floating point operations = floating point operations.

Either way, just disproving the foolish notion that such a card could be had for $250.
 

coastel

Member
Rich coming from someone that has no idea what RAM is.

What has this got to do with absurd comments? I'm in the process of looking at parts for a PC, and even someone like me can see it was a silly comparison. Also, on my comment about the RAM: thanks for being so informative; you'd rather put someone down than explain. My point about the PS4's build was that I'm sure I read somewhere that CPU tasks can be helped by it. I stated I'm no tech expert, so I would have liked to know what that was about.
 
Except consoles don't rely on drivers. Floating point operations = floating point operations.

Consoles do not have a 1:1 translation of flops either. They have APIs and "drivers" too; it is not as if console programming is all done in assembly or something.

But yes, to say that you can get 5.5 TF for $250 is not smart.
 

thelastword

Banned
Personally, I find it absurd to use "slight" to describe differences of more than 25% in a technical thread discussing comparisons of empirical data.

In OT, when discussing your own view on how much (or how little) it matters to you? Sure, but there are other threads for that. As others have noted, people in tech threads need to stick to the facts or provide more rationale for their subjective views.

Objectively there is a moderate to notable resolution difference between the two. Not slight.

It's just not a subjective kind of thread at the end of the day.
You are absolutely right, but of course you're only speaking to the clouds, because that's not what some persons want to hear. Look at how the PC guys have overtaken this thread when it was a PS4 to Xbox comparison in the first place. Albeit... with a PC screenshot with no reference.

It's par for the course in these threads, really: some persons will try to downplay the differences between the two console versions by using phrases like "it's only a slight difference", "I can't tell the difference between 900p and 1080p", "ohhh... the average Joe won't know the difference". Why do we always have persons coming into a tech thread to announce that they can't tell a difference? Perhaps, if you can't tell a difference, then tech threads are just not for you, because it all seems grey to you anyway...

An image at native res will always look superior to an upscaled image, there's no two ways about it; if you don't mind the blurrier look, that's fine. The game is still intact with very solid performance on the Xbone, and if that's all you have to play it on, fine. If it's your console of choice, that's fine too, but there's no reason to deny the tangible differences afforded by a higher pixel count. Textures appear sharper, there's less shimmering in this case, and picking up on detail is that much better overall.

On the flip side, it's really an injustice for some persons to come to a thread and make a comparison that's not fair or justified. Comparing a high-res image (which we have no reference for) to a blurry YouTube video of the console version is wrong. It detracts from very solid work done by a studio that's gone through hell and high water to get these console versions to a very remarkable level of performance.
 

SmokedMeat

Gamer™
Great to see the Metro Redux CONSOLE comparison thread has evolved 100% to PC > consoles talk.

Now let's see if the Future PC thread gets shitted up with console discussion. My money says not at all.
 
What the hell happened in this thread?

I really have no idea. I posted some screens comparing the lack of volumetrics, and then somehow someone said something along the lines of "the volumetrics do not look too good anyhow" or "other games do them better." I have no idea, but either way, we now have this.

Great to see the Metro Redux CONSOLE comparison thread has evolved 100% to PC > consoles talk.

Now let's see if the Future PC thread gets shitted up with console discussion. My money says not at all.

Relax man, not everyone here is being venomous. The discussion has been quite good.
 
You already have 3 times faster GPUs. And not 1000 dollars ones, but 250 dollars ones.

Doesn't the 780 Ti cost ~2x the PS4 and offer only ~2x the power?

Doesn't matter anyway; price/performance-wise nothing can beat the PS4. They explicitly chose to go down this route. Last gen they went for power and costs spiraled out of control. Not to mention, the shiny, advanced PS3/360 were outdated within two years of release, and all MS/Sony got were expensive consoles that didn't sell well and heating issues, thanks to exotic parts that led to broken consoles. It all led to last gen being longer than usual, because the platform holders needed more time to get their investment back.

This gen we have a console that's $200 cheaper, is selling extremely well, will get cheaper faster because of its components, and is extremely dev-friendly.
 
I looked at this thread when it was on page 1 and thought, "Ah, cool". Now I come back, it's on page 10, and I don't understand a freaking word of it. I need some lessons. Shit.
 
Can we talk about how bad it is that DF only offers a limited number of compressed JPGs in its comparisons?

Would it be so hard to ask for PNGs... or just in general more awareness towards IQ?

Or similarly... properly uncompressed/low compression videos at native framerate?
 

Kezen

Banned
My apologies... I have my fair share of responsibility for the path taken by the thread.
Sorry, I just asked a question. I didn't know it would lead to this.
 
My apologies... I have my fair share of responsibility for the path taken by the thread.
Sorry, I just asked a question. I didn't know it would lead to this.

Haha, that is true actually, you asked about the PC GPU required to get a PS4 like experience :p
 
I fully understand the ability and willingness of people to upgrade, and how this feeds into developer decisions about specification requirements and recommendations. I'm saying X happens; you're saying it happens because of XYZ... and I don't disagree.

The reason why this happens is basically the whole point of the discussion, no? PC gamers upgrade their PCs because they want to and can do so; console gamers don't upgrade their consoles because they can't. PC developers know that the hardware baseline is raised with every passing year and make sure to offer better graphics as time progresses. Eventually you will need to upgrade your PC, not because console optimizations raised the minimum requirements but because PC devs did, in response to their audience's buying habits.

So it is true that a 2005 PC might not run the PC version of, say, Far Cry 3 as well as the 360 will run the 360 version. Not because of the 360 coding to the metal and such, but because the PC version contains advanced graphical effects not seen on the console version.
 

coastel

Member
Haha, that is true actually, you asked about the PC GPU required to get a PS4 like experience :p

LOL, the PC image may not have helped in a thread about a console comparison, though it was nice seeing what was missing, so meh. It looks like a good remaster compared to the old-gen consoles, and at 60fps the few things we may have lost would have been worth it. I'm so glad I have never played any of these and can't wait to, with better performance than what I would have had on the PS3/360.
 
Just to get back on topic, I believe that the developers made the right call by sticking to 60fps instead of improving the graphics. Framerate über alles!
 

Daemul

Member
Just to get back on topic, I believe that the developers made the right call by sticking to 60fps instead of improving the graphics. Framerate über alles!

Hopefully with the PS5/XB4 devs won't have to choose, they'll be able to give us console gamers both stunning graphics with all the effects AND 60fps. Believe!
 

Kezen

Banned
Hopefully with the PS5/XB4 devs won't have to choose, they'll be able to give us console gamers both stunning graphics with all the effects AND 60fps. Believe!

They will always have to choose. No matter your hardware, a game targeting 30 will always look better than a game targeting 60.
 

Oemenia

Banned
Actually, in games like Crysis 2 and 3, "low" still means much, much higher than the last-gen consoles. You can test it quite easily by loading up the "Xbox 360 config."

You cannot just post baseless conjecture concerning things... please post the settings of the games you mean...
Please drop the strawmen and ad homs; they don't do this discussion any favours. That was one game off the top of my head, but another I can think of is Alan Wake. If you follow DF, they regularly say console settings tend to be a bit higher than Low, or at least on par.

And using your own Crysis 2 example, Crysis 3 runs and looks better on console, invalidating your own point. I can assure you that nothing I say in this thread is meant to be taken personally (or as being against PC gaming generally).

The 8800 GT is 10% faster than the GTS that came out in 2006, so I don't know what your point is.
Which means it was based on an architecture that was a year newer and fully DX10 compliant. Then, when you factor in that it's conservatively 2x more powerful (the difference in the raw numbers is even bigger), you're looking at a chip that's simply in another league.
 

On Demand

Banned
I don't like the downplaying of the differences this generation. Last generation, any little thing the 360 did better was enough to call it superior and make everyone go with that version. This time we're talking whole resolution differences, and now it's "they look the same", "I can't see a difference anyway", "resolution isn't everything."


Pls.
 
I don't like the downplaying of the differences this generation. Last generation, any little thing the 360 did better was enough to call it superior and make everyone go with that version. This time we're talking whole resolution differences, and now it's "they look the same", "I can't see a difference anyway", "resolution isn't everything."


Pls.

The differences are less pronounced so far. Last gen there were massive resolution differences (often going far below 720p), entire effects removed, and on top of that the PS3 versions ran worse as well. It improved as the gen went along, but it was rough at first. The F.E.A.R. release on PS3 is a good example. That's still the case at times with the PS4 and One, but it's not as bad as that example, The Darkness, or many others.
 

KKRT00

Member
Which means it was based on an architecture that was a year newer and fully DX10 compliant. Then, when you factor in that it's conservatively 2x more powerful (the difference in the raw numbers is even bigger), you're looking at a chip that's simply in another league.
No, the numbers aren't bigger than 2x, and the games don't just run similarly, they run two times better, so there is no degradation in relative performance even after 6 years.
Is this so hard to grasp?

---
And using your own Crysis 2 example, Crysis 3 runs and looks better on console, invalidating your own point. I can assure you that nothing I say in this thread is meant to be taken personally (or as being against PC gaming generally).
But it also looks better on a similar PC.
 
but another I can think of is Alan Wake.

And using your own Crysis 2 example, Crysis 3 runs and looks better on console, invalidating your own point. I can assure you that nothing I say in this thread is meant to be taken personally (or as being against PC gaming generally).

Alan Wake: some settings on the PC, some medium or high, are equivalent to the Xbox 360's. Of course, a dev interview confirms this. But the Xbox 360 build also runs at 960x540... so there are many mitigating factors.

I don't understand what you mean with your point concerning Crysis 2 and 3 that I highlighted. How does what you say disprove my point about "low" on PC having higher CVAR values than the Xbox 360?
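If anyone wants to check that CVAR claim directly, here is a minimal sketch for diffing two CryEngine-style config files; the filenames are placeholders and it assumes the simple "cvar = value" line format:

```python
# Minimal sketch: diff CVAR values between two CryEngine-style .cfg files,
# e.g. a leaked "Xbox 360 config" vs the PC low-spec config.
# Filenames are placeholders; assumes simple "cvar = value" lines.
def load_cvars(path):
    cvars = {}
    with open(path) as f:
        for raw in f:
            line = raw.split("--")[0].strip()  # drop trailing comments
            if "=" in line:
                key, value = line.split("=", 1)
                cvars[key.strip()] = value.strip()
    return cvars

xbox360 = load_cvars("xbox360_config.cfg")
pc_low  = load_cvars("pc_lowspec.cfg")

for cvar in sorted(set(xbox360) | set(pc_low)):
    a, b = xbox360.get(cvar, "<unset>"), pc_low.get(cvar, "<unset>")
    if a != b:
        print(f"{cvar}: 360={a}  PC low={b}")
```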
 
I don't like the downplaying of the differences this generation. Last generation, any little thing the 360 did better was enough to call it superior and make everyone go with that version. This time we're talking whole resolution differences, and now it's "they look the same", "I can't see a difference anyway", "resolution isn't everything."


Pls.

You are right; sadly, it's a common occurrence. During the last 3-4 years, when both previous-gen consoles lagged far behind PC in quality, resolution and framerate, the differences (which were often 3x the resolution and double the framerate) were routinely dismissed by quite a few Xbox and PlayStation owners because "it's essentially the same game." Now a difference of 30% is considered huge because it's between the Xbox and the PlayStation. Very few gamers can claim innocence when it comes to applying double standards; all sides have done it.
 