
Digital Foundry - Metro Redux (Console Analysis)

Kezen

Banned
The problem is a lack of exactness when people talk about this. (greatly exacerbated by the age of twitter communication and its byte-sized quotables)

When you run a given HLSL shader on a console GPU and an equivalent PC GPU, they'll perform the same. If the PC GPU is twice as fast the shader will run twice as fast. How could it be different? The same shader compiler made by the same company is compiling it, and the same hardware is running it. Now, game developers may spend more time low-level optimizing a shader specifically for a single console GPU than they do for every PC GPU, but how much of a difference such hardware-specific optimizations really make greatly depends on the situation, and it's a huge time investment.

Thanks for the response. I was not factoring in the cost of the optimization on the developer side; they may choose not to dive too deep this time around.
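
For what it's worth, the quoted proportionality argument is easy to sanity-check with a toy model. A minimal sketch with entirely made-up numbers; the function and the 15% console "tuning gain" below are assumptions for illustration, not measurements of any real GPU or game.

Code:
# Hypothetical back-of-envelope model: for the same compiled shader, execution
# time is roughly work divided by GPU throughput, with an optional factor for
# console-specific low-level tuning. All numbers are made-up placeholders.

def shader_time_ms(work_gflop, gpu_gflops, tuning_gain=1.0):
    """Estimate shader execution time in milliseconds.

    work_gflop  -- arithmetic work the shader performs (GFLOP)
    gpu_gflops  -- sustained GPU throughput (GFLOP/s)
    tuning_gain -- >1.0 if the shader was hand-tuned for this specific GPU
    """
    return work_gflop / (gpu_gflops * tuning_gain) * 1000.0

console = shader_time_ms(0.5, 1800, tuning_gain=1.15)  # tuned for one known GPU
pc      = shader_time_ms(0.5, 3600)                     # untuned, GPU twice as fast

print(f"console: {console:.3f} ms, pc: {pc:.3f} ms")
# The twice-as-fast PC GPU still finishes sooner unless console-specific tuning
# buys more than a 2x speedup, which is rare for a single shader.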
 

Handy Fake

Member
FCDlIse.jpg

Fun fact: once you've cleared the area, if you stand underneath that light and throw a bottle into the air, the shadow casts across all the buildings.

Great fun.
 
One version of the game being better is a completely objective thing. This is known through actual testing of the game. If you don't care about the differences that's absolutely fine. To pretend they are equal is just factually wrong.
 

Kezen

Banned
Artifacting yes, but TLoU and Uncharted did it well without inducing much of it, especially the Remaster, where it's pretty much non-existent. Also, volumetric lighting should look soft because it's basically light interacting with dust particles, and dust particles give it a soft edge. I always had a personal dislike of how sharp it looked in Metro 2033.

In TLoU, all enemy flashlights are volumetric, plus a lot of other lights.
EWnWjmY.jpg

Example of an enemy flashlight interacting with the player.
http://i.cubeupload.com/hfYtmz.jpg

Here you can see the shadow it produces despite the light being off screen (screen space light shafts don't produce shadow shafts...or whatever they are called). This isn't the best shot that I could take in this location but it's apparent near his head.
YfnQgbh.jpg

pi5rPLu.jpg

FCDlIse.jpg


http://i.imgur.com/uFsFTkt.jpg
http://i.imgur.com/qJrWTjC.jpg

Any artifacts you see (like Joel's T-shirt in the last pic) are due to Facebook.

This looks really bad.
 
Those are quite "sharp" volumetrics (the edges of the beam, not the internal shadows)... are you sure you dislike the ones in metro?

Interesting to see, thanks for the pictures. It is a shame the images are of such low quality and very few show the actual shadow beams (object intersection).
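
On the shadow beams / object intersection point: a true volumetric light is usually ray-marched, sampling the shadow map at points between the camera and the surface, which is exactly what lets occluders carve dark wedges out of the beam. A rough, hedged sketch of the idea (not 4A's or Naughty Dog's actual implementation; the in_shadow and scatter helpers are hypothetical stand-ins for a shadow-map lookup and a density/phase term).

Code:
# Hedged sketch of single-scattering ray marching, the usual way shadowed
# "god rays" are done. Helper functions are hypothetical placeholders.

def volumetric_light(ray_origin, ray_dir, max_dist, in_shadow, scatter, steps=64):
    """Accumulate in-scattered light along a camera ray.

    in_shadow(p) -- True if point p is occluded from the light (shadow-map test)
    scatter(p)   -- per-point scattering contribution (density * phase * intensity)
    """
    step = max_dist / steps
    light = 0.0
    for i in range(steps):
        p = [o + d * (i + 0.5) * step for o, d in zip(ray_origin, ray_dir)]
        if not in_shadow(p):          # occluders create the dark "shadow beams"
            light += scatter(p) * step
    return light

# Trivial usage with dummy helpers: a slab occluder from x=2..3 blocks the light.
beam = volumetric_light(
    ray_origin=(0.0, 0.0, 0.0), ray_dir=(1.0, 0.0, 0.0), max_dist=10.0,
    in_shadow=lambda p: 2.0 < p[0] < 3.0,
    scatter=lambda p: 0.1,
)
print(beam)  # less than 1.0, because the occluded segment contributes nothing

A screen-space shaft effect, by contrast, just blurs the bright on-screen pixels toward the light, so there is nothing to test against a shadow map; that's why it can't produce those beams from off-screen lights or occluders.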
 

nOoblet16

Member
This looks really bad.

I don't see it.

I've seen far worse implementations, and you also have to give it credit for being a last-gen game. They did not upgrade the post-processing effects or the shader quality in the remaster, so it is essentially PS3 quality at a higher resolution (the lights, I mean).

Those are quite "sharp" volumetrics (the edges of the beam, not the internal shadows)... are you sure you dislike the ones in metro?

Interesting to see, thanks for the pictures. It is a shame the images are of such low quality and very few show the actual shadow beams (object intersection).

I haven't seen 2033 recently, but I remember not liking it as much as Crysis 1 back in the day, precisely because of how sharp they looked.
The TLoU lights have sharp edges, but the light volume itself looks quite soft, and that lack of softness is what bothered me the most in Metro.

And I'd take better pictures if I could, but the PS4 share button has a lot of lag when taking screenshots, so I often miss the shadows when they're there; all of those lights do cast shadow beams, though.
 
Compare that GPU vs a 360 with 2005/2006 era titles vs 2011+ era multiplats. It would have chewed Oblivion up but the same couldn't be said for Skyrim, for example. I'm not sure it would even run some of the latest stuff. Would have been fine with CoD 2006, but Ghosts? Assassin's Creed (2007), but Black Flag?

Not many people have x1800-x1950 gen card lying around any longer...

Will this do?
https://www.youtube.com/watch?v=0vgIRTdXdio
 

nOoblet16

Member
It's bad compared to Metro:

Metro%20Last%20Light%206.jpg

I wasn't comparing it to Metro, my point was the game ran it on PS3, also the look is kind of stylized (much like Alan Wake and the alien ship in Crysis).
Plus I'm not sure if the lighting in that particular shot is really volumetric, it could be baked shafts as well (the game does use them too). I won't be able to tell unless there's any sort of object intersection or see it in movement.
 
I wasn't comparing it to Metro, my point was the game ran it on PS3.
Plus I'm not sure if that's really volumetric, it could be baked shafts as well (the game does use them too). I won't be able to tell unless there's any sort of object intersection.

That is from LL and I am 100% sure it is a baked one. LL is full of them.

2033:
43110_2014-08-11_00006gjli.png

43110_2014-08-11_0000f3k0r.png

etc...

They have a penumbra effect within the shadow intersection. They start sharp and end diffuse.
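
That "start sharp, end diffuse" look is just penumbra growth from an area light. A hedged sketch of the usual similar-triangles estimate (the same relation PCSS-style soft shadows use); the light size and distances below are made up purely for illustration.

Code:
# Penumbra width from an area light, by similar triangles:
#   w_penumbra = light_size * (receiver_dist - blocker_dist) / blocker_dist
# Right at the occluder the shadow edge is sharp (width ~0); the further the
# shadow (or shadow beam) travels, the softer it gets. Numbers are illustrative.

def penumbra_width(light_size, blocker_dist, receiver_dist):
    return light_size * (receiver_dist - blocker_dist) / blocker_dist

for receiver in (1.05, 2.0, 5.0, 10.0):
    print(receiver, penumbra_width(light_size=0.3, blocker_dist=1.0, receiver_dist=receiver))
# ~0.015, 0.3, 1.2, 2.7 respectively: sharp near the occluder, diffuse far away.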
 

Oemenia

Banned
Compare that GPU vs a 360 with 2005/2006 era titles vs 2011+ era multiplats. It would have chewed Oblivion up but the same couldn't be said for Skyrim, for example. I'm not sure it would even run some of the latest stuff. Would have been fine with CoD 2006, but Ghosts? Assassin's Creed (2007), but Black Flag?

'Spec inflation' on multiplats and console ports as the last gen wore on was a thing. Maybe it won't be this gen, but I wouldn't be terribly confident about betting on that.
Some of the more zealous posters have left, you won't be getting a reply for speaking too much sense.

When you run a given HLSL shader on a console GPU and an equivalent PC GPU, they'll perform the same. If the PC GPU is twice as fast the shader will run twice as fast. How could it be different? The same shader compiler made by the same company is compiling it, and the same hardware is running it. Now, game developers may spend more time low-level optimizing a shader specifically for a single console GPU than they do for every PC GPU, but how much of a difference such hardware-specific optimizations really make greatly depends on the situation, and it's a huge time investment.
Thanks for confirming this. With so much money on consoles and them being lead platforms, I think they will.
 

MarkV

Member
Volumetric lights in the original Metro 2033 are great, but they are also heavy. Killzone: Shadow Fall IMO has probably the best implementation yet, because the volumetric lighting in that game looks good and is also very fast.

Uncharted 3 and TLOU did a very good job with them too on the previous gen.
 
Compare that GPU vs a 360 with 2005/2006 era titles vs 2011+ era multiplats. It would have chewed Oblivion up but the same couldn't be said for Skyrim, for example. I'm not sure it would even run some of the latest stuff. Would have been fine with CoD 2006, but Ghosts? Assassin's Creed (2007), but Black Flag?

'Spec inflation' on multiplats and console ports as the last gen wore on was a thing. Maybe it won't be this gen, but I wouldn't be terribly confident about betting on that.
Some of the more zealous posters have left, you won't be getting a reply for speaking too much sense.



8600GT Asscreed 3
8600GT Battlefield 3
8600GT Tombraider 2013
8600 GT w/ Black ops

For example
 

gofreak

GAF's Bob Woodward
Not many people have x1800-x1950 gen card lying around any longer...

Will this do?
https://www.youtube.com/watch?v=0vgIRTdXdio

That card isn't even nearly as powerful as a 360 though, no? So it might not be as good a comparison even if it's newer. It possibly still illustrates the trend though...I wonder how it fared vs console in BF2 then 3 then 4.

Unfortunately I can't find much info on the X1950 with BF3... however, I don't think it meets the requirements for BF4 at all. You can see videos of someone playing Skyrim with an X1950, but it's at half the res of the console version. Checking other sites, I'm not sure the X1950 met minimum requirements for some of the other games mentioned.

On the other hand, the last two videos you link to above are actually of an 8800GTS... depending on which GTS that is, it's to one degree or another not very equivalent to console hardware, for the opposite reason to the 8600GT.

However, would you contend that any of these GPUs fared as well against consoles last year as they did in 2006/7?

Maybe the same trends won't manifest this gen, maybe it won't be as long a gen either, who knows. Anyway, this is probably a broader discussion than Metro...
 

Shin-Ra

Junior Member
Maybe because it wasn't an issue? Where did you see black crush?

Darker, yes. Crushed, no.
It's crushed, John.

iT8MxDru3GYRL.png


edit: But there's also this, which isn't a fault of the game or PS4 video output.

I checked the image's luminance histograms. Aside from the "PS4" on the image (which is pure white), the data is all within the 16-240 region. It looks like DF was interpreting limited-range incoming video data as full range.

So yeah, the newer images are messed up. The data should have been presented over a larger luminance range.
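
If the capture chain really did read limited-range output as full range, presenting it correctly is just the standard levels expansion. A minimal sketch with NumPy, assuming 8-bit data and the usual 16-235 video levels (an illustration, not a claim about what DF's capture setup actually does).

Code:
import numpy as np

def expand_limited_to_full(img_u8):
    """Map limited-range (16-235) 8-bit values onto full range (0-255).

    This is the standard video-levels expansion; values outside 16-235 get
    clipped, which is also where 'crushed blacks' come from if a display or
    capture stage skips or double-applies this step.
    """
    img = img_u8.astype(np.float32)
    full = (img - 16.0) * (255.0 / (235.0 - 16.0))
    return np.clip(full, 0, 255).astype(np.uint8)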
 

RoboPlato

I'd be in the dick
Oh boy, I see the console OPTIMIZATIONS!#%$!~ BS is still alive and well....

It's a real thing, the problem is that the term is horribly exaggerated by one side and overly downplayed by the other, to the point where these arguments are worthless. Same thing happens when arguing diminishing returns. The reality is that it varies a substantial amount from process to process, game to game, and developer to developer, to the point where very few people who aren't developers on a specific game can really weigh in on anything related to it.

I also don't know why it's really a big deal at all in this thread. These games are running at 1080/60 on console hardware and only had a couple of the most taxing effects pared back a bit. It's very impressive for a first release from a small studio on these platforms. Those effects probably would have been in if it was a 30fps release or if they were willing to tolerate more framedrops but 4A went for a locked 60.
 

nOoblet16

Member
That is from LL and I am 100% sure it is a baked one. LL is full of them.

2033:
43110_2014-08-11_00006gjli.png

43110_2014-08-11_0000f3k0r.png

etc...

They have a penumbra effect within the shadow intersection. They start sharp and end diffuse.

Now that's really good, especially due to how diffuse it gets (the distortion from the mask helps too). I must mention that I never really finished 2033 as I didn't like the game much, so I obviously missed out on seeing a lot of areas. (The outdoors, as in out in the open, looked trash compared to indoors though, mostly down to aesthetics and some weird textures.)
 

HTupolev

Member
edit: But there's also this, which isn't a fault of the game or PS4 video output.
To maybe expand on that, and make it a little more precise.

//=====================

Here are the histograms for the 2nd comparison pic in the zoomed comparison.
PS4 top, XB1 bottom.

u5HwNf2.png


(The big spike on the left side of the PS4 distribution is also present on the XB1 image, it's just sitting on the left border because I framed these images really poorly.)

The PS4 image contents look very similar to XB1's, except compressed within a range that looks suspiciously like the bounds for the "LDR" content in limited-range RGB. There's a little bit of image content outside of that range; some of this is the "PS4" text on the image, the rest is pretty small and looks as though it's not much more than jpeg-related histogram bleeding.

Images 1 and 4 in the comparison have a similar thing going on.

//=====================

Images 3 and 5 are different. Image 3, the one with the snowy car, has this histogram, which looks more similar between PS4 and XB1:

CCFzLSO.png


//=====================

tl;dr DigitalFoundry probably captured some of the PS4 images from an RGB Limited source with an RGB Full receiver.
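
A hedged way to reproduce that kind of check yourself, using Pillow and NumPy; the filenames, the 16-235 bounds, and the 1% tolerance are my assumptions for illustration, not HTupolev's or DF's actual method.

Code:
import numpy as np
from PIL import Image

def looks_limited_range(path, tolerance=0.01):
    """Heuristic: if almost all pixel values sit inside 16-235, the capture
    was probably limited-range video interpreted as full range."""
    luma = np.asarray(Image.open(path).convert("L"), dtype=np.uint8)
    outside = np.count_nonzero((luma < 16) | (luma > 235))
    return outside / luma.size < tolerance

print(looks_limited_range("ps4_shot.png"))   # hypothetical filenames
print(looks_limited_range("xb1_shot.png"))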
 

mintylurb

Member
It's crushed, John.

iT8MxDru3GYRL.png


edit: But there's also this, which isn't a fault of the game or PS4 video output.

Now, why would DF capture PS4 images differently this time around? Considering DF's reputation as the bastion of objectivity and truth, I doubt they would intentionally soften the PS4 images to make it seem the difference between 912p and 1080p isn't quite as big as the raw data suggests.
Wish Rich would post.. maybe dark10x can shed some light on this?
 

thelastword

Banned
I really don't know why the PC guys are in here comparing a full-res PC shot to a low-bitrate video shot on the console side. Volumetric lighting on the PC is very taxing, and what we don't know about that comparison shot is what resolution and framerate it's running at, and on what PC (spec-wise), at what price. What we do know is that the PS4 is running this at 60fps at 1080p. There are too many unknowns on the PC side of this comparison, so at this point it's a form of FUD from the detractors.
 

KKRT00

Member
Compare that GPU vs a 360 with 2005/2006 era titles vs 2011+ era multiplats. It would have chewed Oblivion up but the same couldn't be said for Skyrim, for example. I'm not sure it would even run some of the latest stuff. Would have been fine with CoD 2006, but Ghosts? Assassin's Creed (2007), but Black Flag?

'Spec inflation' on multiplats and console ports as the last gen wore on was a thing. Maybe it won't be this gen, but I wouldn't be terribly confident about betting on that.

And again, no examples.
I've given you the highest-end title from 2011 that's running on higher specs than the console version.
Where is your example? A real example.

----
Specs of the system in your vid:

Intel C2D E6500 2.93 ghz
ATi Radeon X1950
4gb RAM

That's hardware that's 3-4x more powerful than a 360 and it's only on par? We haven't even factored in cost!
What? Even a C2D + 8800GT combo isn't 3 times more powerful, more like 2.1-2.3x, and this PC is maybe only slightly more powerful than the past-gen consoles, like 10-20%.
Hardware 4 times as powerful runs past-gen games at 1080p and 60fps on higher specs.
 

nOoblet16

Member
Those TLOU shots really don't look at all comparable to what was posted of Metro 2033 earlier in the thread, volumetric lighting wise. They could almost just be a few screen-aligned sprites.
I know that the discussion about it is over and I don't necessarily disagree with you, but I had to correct you on it because they are not sprites (like in Mass Effect or countless other games); they do have a volume, very similar to the ones in Alan Wake. They are moving light sources (flashlights, searchlights, etc.) and they intersect with objects to create shadow beams. The quality is obviously not as good, but it was meant for the PS3.
 

thelastword

Banned
Is there any word if this game is using the Kinect allotment that was freed up to devs?
Yes, it uses the allotment freed up from Kinect. Read below.

The link here
The original plan of the developers was to ship the Xbox One version of Metro: Redux at 900p, but this small improvement in resolution comes after the June SDK update, which frees up Kinect GPU resources for developers.


Even with that, you realize that it still tears on Xbox One. I hope you weren't expecting the Xbox to match the PS4 version like for like?
 
It's crushed, John.

iT8MxDru3GYRL.png


edit: But there's also this, which isn't a fault of the game or PS4 video output.


Ouch. Crushed blacks are just evil. Hard to understand why anyone would choose a multiplatform game on Xbox if they owned both consoles. There's just too much of a difference that you simply shouldn't be seeing between two consoles released at the same time.
 

TheD

The Detective
This begs the question then: why did Sony go the extra mile and shove 8 into the console's GPU? There must be a good reason.
The R9 290X has 8 ACEs, so I'm led to believe this will greatly improve performance. My point is that the hardware required to match the PS4 1:1 right now for multiplats will not cut it in the next 18 months or so. I can't believe developers have really hit the ceiling just yet.

Sony did not, AMD did!
The 8 ACEs are part of GCN 1.1 (or whatever it is called) and thus any GCN 1.1 GPU will have the same; it is not unique to the consoles.

I don't see why it would be much different from previous generations. Heck I don't understand why the idea of console optimisation is such a problematic reality for some PC users.

Because it is not a "reality", not to the extent that console fanboys crap on about!

However, what's with the mass denial of console optimisation from a section of the PC gamers here? After all, those things are only trying to emulate the fixed-hardware nature of consoles.

Specs of the system in your vid:

Intel C2D E6500 2.93 ghz
ATi Radeon X1950
4gb RAM

That's hardware that's 3-4x more powerful than a 360 and it's only on par? We haven't even factored in cost!

Glad I'm not one of them.

That GPU is not even close to 4x as powerful as the 360's GPU nor is that CPU.

I suggest you do not speak on subjects you have no knowledge on.
 

Kinthalis

Banned
Yeah, Carmack was lying through his teeth when he suggested the exact opposite of what your stance entails. Who to believe?

Carmack was talking about a very specific benchmark and DX9. NOT modern PC APIs, NOT any benchmark that is useful in terms of real-world performance in a game. Draw calls are not all a game engine does. He's made more relevant comments since then, BTW, including the one from the latest Nvidia summit earlier this year, in which he said that in terms of real-world performance there's really not much difference, and optimization on consoles isn't going to create any sort of meaningful gap.

The overhead with current APIs means that CPUs do need to be more powerful than a console's in order for CPU bottlenecking not to be an issue at console-level settings/resolution. But that's going away, and it doesn't do much to change things when the CPU isn't the bottleneck.
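
A toy model of that last point, with made-up numbers: cutting per-draw-call CPU overhead (what console APIs and the newer PC APIs aim at) only changes the frame time while the CPU is actually the bottleneck.

Code:
# Toy frame-time model: the frame takes as long as the slower of the CPU
# (submitting draw calls) and the GPU (rendering them). Numbers are made up.

def frame_ms(draw_calls, cpu_us_per_call, gpu_ms):
    cpu_ms = draw_calls * cpu_us_per_call / 1000.0
    return max(cpu_ms, gpu_ms)

# High-overhead API, CPU-bound: cutting overhead helps a lot.
print(frame_ms(5000, cpu_us_per_call=5.0, gpu_ms=12.0))   # 25.0 ms (CPU-bound)
print(frame_ms(5000, cpu_us_per_call=1.0, gpu_ms=12.0))   # 12.0 ms (now GPU-bound)

# Already GPU-bound: the same overhead reduction changes nothing.
print(frame_ms(1000, cpu_us_per_call=5.0, gpu_ms=16.0))   # 16.0 ms
print(frame_ms(1000, cpu_us_per_call=1.0, gpu_ms=16.0))   # 16.0 ms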
 

evildose

Neo Member
Keep in mind there will be "Spartan" and "Survival" modes in both Metro:LL and 2033 Redux. "Spartan" plays more like original Last Light, "Survival" plays more like original 2033.
I was unaware of this. I'll definitely be getting this. Now I just have to decide whether to get it on PS4 or PC.
 

Kinthalis

Banned
... or am I a console pleb for speaking otherwise? The rhetoric has been used before, I might assure you, and time and time again it's by the same people who swear their PC hardware will outperform consoles for the rest of the gen.

Are you seriously suggesting that my 780 ti ISN'T going to outperform a PS4 for the rest of the gen?

And please, get off your cross. No one here is calling you a console "pleb"; YOU are the only one bringing such terminology into the thread.
 
Are you seriously suggesting that my 780 ti ISN'T going to outperform a PS4 for the rest of the gen?

And please, get off your cross. No one here is calling you a console "pleb"; YOU are the only one bringing such terminology into the thread.

As much as I think PCs are superior to consoles for gaming, I don't think that your 780ti will outperform a PS4 in, say, 5 years' time. Think about it like this: it's like saying that a GeForce 7950 GT would outperform the PS3 and 360 for their whole generation. That said, a 780ti will probably outperform next-gen consoles in multi-plat games for 4-5 years.
 

Caayn

Member
As much as I think PCs are superior to consoles for gaming, I don't think that your 780ti will outperform a PS4 in, say, 5 years' time. Think about it like this: it's like saying that a GeForce 7950 GT would outperform the PS3 and 360 for their whole generation. That said, a 780ti will probably outperform next-gen consoles in multi-plat games for 4-5 years.
Your comparison is faulty. The consoles aren't powered by high-end GPUs this time; instead they're equipped with low-end ones. A 780ti is multiple times faster than what's in the PS4 and Xbox One; it'll last the generation just fine while outperforming the consoles.
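
For rough context on "multiple times faster", here's the usual peak-FP32 arithmetic from the commonly quoted shader counts and clocks. Peak FLOPS is a crude proxy and says nothing about real-world performance, but it is where the "multiple times" figure comes from.

Code:
# Peak FP32 throughput = 2 * shader_cores * clock_GHz (GFLOPS).
# Widely quoted specs; peak FLOPS is a rough proxy, not real-world performance.

def peak_gflops(cores, clock_ghz):
    return 2 * cores * clock_ghz

gtx_780_ti = peak_gflops(2880, 0.875)   # ~5040 GFLOPS
ps4        = peak_gflops(1152, 0.800)   # ~1843 GFLOPS
xbox_one   = peak_gflops(768, 0.853)    # ~1310 GFLOPS

print(gtx_780_ti / ps4)        # ~2.7x the PS4
print(gtx_780_ti / xbox_one)   # ~3.8x the Xbox One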
 

Darknight

Member
Looks great but I'll wait till it drops lower. Black Friday can't come soon enough!

Odd to see 1080p being downplayed in the first few pages. You guys realize that should be the standard for this gen, right? Not saying anything is wrong with the Xbone version, but I wouldn't be arguing with someone who is using facts (not me, but those who were posting percentages).

Good job by the developer though. Getting 60FPS on both versions plus improving this game is awesome. It was a buggy game even on PC, so it's good they didn't just port it over.
 
Your comparison is faulty. The consoles aren't powered by high-end GPUs this time; instead they're equipped with low-end ones. A 780ti is multiple times faster than what's in the PS4 and Xbox One; it'll last the generation just fine while outperforming the consoles.

This is true.
 

Durante

Member
I really don't know why the PC guys are in here comparing a full-res PC shot to a low-bitrate video shot on the console side. Volumetric lighting on the PC is very taxing, and what we don't know about that comparison shot is what resolution and framerate it's running at, and on what PC (spec-wise), at what price. What we do know is that the PS4 is running this at 60fps at 1080p. There are too many unknowns on the PC side of this comparison, so at this point it's a form of FUD from the detractors.
What? People were pointing out that an effect is missing, even while noting clearly that the shots weren't to be taken for IQ comparison purposes.

Your comparison is faulty. The consoles aren't powered by high-end GPUs this time; instead they're equipped with low-end ones. A 780ti is multiple times faster than what's in the PS4 and Xbox One; it'll last the generation just fine while outperforming the consoles.
Absolutely.
 

cheezcake

Member
As much as I think PCs are superior to consoles for gaming, I don't think that your 780ti will outperform a PS4 in, say, 5 years' time. Think about it like this: it's like saying that a GeForce 7950 GT would outperform the PS3 and 360 for their whole generation. That said, a 780ti will probably outperform next-gen consoles in multi-plat games for 4-5 years.

I benchmarked my OC'd GTX 570 + 2500K on the Battlefield 4 campaign and it actually slightly outperformed the PS4 on roughly equivalent settings. I'll see if I can dig up the graphs again, but these consoles are quite underpowered.

A 780ti will smash the PS4/X1 throughout its lifetime.
 

low-G

Member
Just noticed in a comparison that the Redux version is missing the dappled, shimmering (color shifting) lights in Metro 2033. Even the Xbox 360 & PS3 versions have that effect... Lots of lighting simplifications.
 