Res Analysis Thread

#1
Alright, I can start a new thread now (no longer a junior yay :D )

Since there are many people who just hate seeing crap like this in their beloved game threads, I'll keep it here from now on.

And here you go, the first game: Naruto Ultimate Ninja Storm 2!




PS3




360


* These shots are originally from ps360's blog. For more comparison shots, please visit his blog at http://blog.livedoor.jp/ps360/


When a 360 game runs at a low resolution, it's almost always related to its 10MB eDRAM limit. But that's not the case with Naruto UNS 2, since the game uses no AA at all (so no tiling is required), which means the lowered rendering resolution is purely down to performance issues. The ps360 blog provides some short frame rate analysis clips for both versions, and even there, the PS3 runs slightly better. This is a very interesting specimen of a multi-platform game, since we're not talking about Naughty Dog or Sony Santa Monica Studio here. This is Bandai Namco, a mere Japanese third party company to begin with! No one would expect some crazy performance-enhancing SPU coding from a Japanese company, certainly not from a third party multi-platform game developer. I'd suppose most of the rendering is done on the GPU in UNS 2, and it is commonly known how helpless the PS3's RSX is against the 360's Xenos without help from the mighty SPUs. Could well-optimized RSX code bring Xenos to its knees? Probably.
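To back up the "no tiling required" point with numbers, here's a quick sanity check. This is a rough sketch, assuming the common setup of a 32-bit 8888 colour buffer plus a 32-bit 24S8 depth/stencil buffer (8 bytes per pixel total); actual formats vary per game.

```python
EDRAM_BYTES = 10 * 1024 * 1024  # the 360's 10 MB of eDRAM

def framebuffer_bytes(width, height, bytes_per_pixel=8, msaa_samples=1):
    """Colour + depth/stencil footprint, multiplied by the MSAA sample count."""
    return width * height * bytes_per_pixel * msaa_samples

no_aa_720p = framebuffer_bytes(1280, 720)                   # 7,372,800 bytes
aa2x_720p = framebuffer_bytes(1280, 720, msaa_samples=2)    # 14,745,600 bytes

print(no_aa_720p <= EDRAM_BYTES)  # True  -> 720p with no AA fits, no tiling
print(aa2x_720p <= EDRAM_BYTES)   # False -> 2x MSAA at 720p would need tiling
```

So a plain 720p buffer fits in eDRAM with room to spare, which is why a sub-720p resolution with no AA points at shader/fill-rate performance rather than memory.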

 

EatChildren

Currently polling second in Australia's federal election (first in the Gold Coast), this feral may one day be your Bogan King.
#7
Disgusting console resolutions. Ugh. The vertical resolution isn't even four digits.
 
#10
What is this x:y pixel ratio thing people get from zooming in on the edge of something? I can't figure out how that tells you the resolution.

 
#15
I'll blame it on being almost 2am, but how do you know to divide 1280 by x, to get K&L2's resolution? Where does 1280 come from in that case? Hmm, I don't really get where any of the numbers come from now after looking at it some more :lol
 
#16
Minamu said:
I'll blame it on being almost 2am, but how do you know to divide 1280 by x, to get K&L2's resolution? Where does 1280 come from in that case? Hmm, I don't really get where any of the numbers come from now after looking at it some more :lol
The image is being displayed at (scaled to) 720p, which is 1280 x 720
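For anyone still puzzled, the arithmetic is just a ratio between native pixels and scaled pixels. This is an illustrative sketch; the step counts here are made up, and real counts come from a clean near-vertical edge in an actual screenshot:

```python
# A sub-HD frame upscaled to a 1280-wide capture preserves the ratio of
# native pixels to scaled pixels. Count N native "steps" along an edge,
# measure the span S in pixels of the 1280-wide screenshot, and solve:

def native_width(steps, span, scaled_width=1280):
    """Estimate the native horizontal resolution from an edge count."""
    return round(scaled_width * steps / span)

# e.g. 16 native steps spread across 20 pixels of a 1280-wide capture:
print(native_width(16, 20))  # 1024 -> consistent with a 1024x576 frame
```

The same trick with a near-horizontal edge against the 720-pixel height gives the vertical resolution.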
 

Y2Kev

TLG Fan Caretaker Est. 2009
#19
Okay, I'll be patrolling this vigilantly. Do: discuss technical details, resolution, consequences. Don't: whine about people discussing technical details, comment on how they don't matter or that you don't care, and so on.
 
#20
Y2Kev said:
Okay, I'll be patrolling this vigilantly. Do: discuss technical details, resolution, consequences. Don't: whine about people discussing technical details, comment on how they don't matter or that you don't care, and so on.
Sound like fair play to me. It'll be interesting to see what discussions come out of this thread.
 
#21
Did Namco ever discuss why they had so many variable resolutions and blur options in Tekken 6? I was really fascinated in how the game was being rendered in completely different ways depending on whether motion blur was on or off.
 
#23
The resolution differences in Naruto and Kane & Lynch 2 are pretty dang small in the big picture of things (and with the latter, the resolution isn't exactly high to begin with), so neither version is that bad compared with the other.

Naruto looks great, especially in motion, but some AA would do wonders.
 
#25
Y2Kev said:
Okay, I'll be patrolling this vigilantly. Do: discuss technical details, resolution, consequences. Don't: whine about people discussing technical details, comment on how they don't matter or that you don't care, and so on.
Pshh got in here 5 minutes too late..

Well then, who's winning?

I really can't tell the difference between them.. my SD eyes are deformed..
 
#26
This dedicated pixel-counting thread is a great idea. Thanks MazingerDUDE.

I'm hoping it stays technical. No need for anyone to veer off into comparative discussions of ice/leaves/colors/grass in here.

Resolution and AA keeps it simple.
Differences in HDR precision would also be interesting.
 
#27
Ooh, it's like our own Digital Foundry discussion. Subscribing
Some nice comparison pics of RDR and FFXIII if possible. For some reason I have a hard time telling much of a resolution difference from the pics on DF.
 
#30
ghst said:
you should throw in 1080p pc versions just for fun. think of it as a benchmark.
Digital Foundry did this for Mass Effect 2 PC vs. Xbox 360 upscaled to 1080p and surprisingly the difference wasn't quite as significant as I expected.
 

EatChildren

Currently polling second in Australia's federal election (first in the Gold Coast), this feral may one day be your Bogan King.
#31
DonMigs85 said:
Digital Foundry did this for Mass Effect 2 PC vs. Xbox 360 upscaled to 1080p and surprisingly the difference wasn't quite as significant as I expected.
Mass Effect 2 really shines on PC when you start cranking the AA.
 
#32
EatChildren said:
Disgusting console resolutions. Uh. The vertical resolution isn't even four digits.
lol seriously. 1024 x 576? are you fucking kidding me? that's really disgusting.
next-gen my ass!

maybe kane & lynch is using too many post-processing effects for consoles so they have to lower the res, who knows!? but yeah, still disgusting :p
 
#33
How much EDRAM would the next Xbox need to fit a full 1920 x 1080 buffer and at least 2x MSAA?
Also it was a sin on Sony's part to cut the ROPS and memory bus width in half for RSX over the full 7800 GTX.
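A back-of-envelope answer to that question, assuming 32-bit colour plus 32-bit depth/stencil per sample (real games might use fatter HDR formats, which would push the number higher):

```python
def edram_needed(width, height, msaa_samples, bytes_per_sample=8):
    """eDRAM footprint for colour + depth/stencil at a given sample count."""
    return width * height * msaa_samples * bytes_per_sample

needed = edram_needed(1920, 1080, 2)
print(needed)                   # 33,177,600 bytes
print(needed / (1024 * 1024))   # ~31.6 MB, more than 3x the 360's 10 MB
```

So a hypothetical next Xbox would want at least 32 MB of eDRAM to hold 1080p with 2x MSAA untiled under these assumptions.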
 

ghst

thanks for the laugh
#34
DonMigs85 said:
Digital Foundry did this for Mass Effect 2 PC vs. Xbox 360 upscaled to 1080p and surprisingly the difference wasn't quite as significant as I expected.
i remember posting a screenshot of crysis in alan wake resolution. that was certainly significant.

it seems disingenuous at this point to only give the thread the bottom end of the options. i'd understand not throwing out 2560x1600 screenies just to rub it in, but 1080p is an accepted pc standard at this point.
 
#35
Great idea. I really hope it stays technical as well. I love Digital Foundry's work.


DonMigs85 said:
How much EDRAM would the next Xbox need to fit a full 1920 x 1080 buffer and at least 2x MSAA?
Also it was a sin on Sony's part to cut the ROPS and memory bus width in half for RSX over the full 7800 GTX.
I read that the 10MB EDRAM can handle 720p or 1080i... so the question is: what happened? Why are we seeing K&L2 sub 720p resolutions? Too complex when factoring high framerates?

I'm also wondering how much money was saved by nerfing the RSX.
 
#37
This is a resolution analysis thread. When you set a PC game to 1080p, that's what it is, no analysis needed. PC screens have no place in this thread.
 
#41
Also to keep in mind (from your Tekken 6 article), about how sometimes a lower resolution can still look nice and sharp: :p

High quality motion blur consumes a lot of performance. If you want something other than camera-motion-based blur, you have to store the motion vectors for each pixel, so the render target memory requirement rises as well. An 8:8 (16-bit) format would be enough for 2D screen-space motion vectors, but for practical reasons you need an 8888 buffer.

1365x768 resolution, 8888 color + 24S8 depth = 8,386,560 bytes, while 1024x576 resolution, 8888 color + 24S8 depth + motion vectors = 7,077,888 bytes.

Both configurations fit well inside the 10MB eDRAM. The 1024x576 is kind of a strange choice, as it's only around half the pixels of 1365x768, and the cost of the blur filter comes nowhere close to the performance gained from the resolution decrease; they are not eDRAM limited either. The resolution reduction itself is not something I consider strange, but a reduction this large means they have something else going on besides the motion blur. The better texture detail you are seeing could mean they have enabled anisotropic filtering at the lower resolution.
People need to keep in mind that in many cases, running at a lower (or non-HD) resolution doesn't automatically translate into a blurry image.

Or like what happened with Final Fantasy XIII (if I remember correctly). The 360 version ran at a lower resolution than the PS3 version, but good upscaling, coupled with the menus, text and icons being rendered at 720p (like on PS3), meant the overall loss in quality was not that noticeable, since small things like the text looked as sharp as in the original.
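The figures quoted above check out if you take 4 bytes per pixel for each 32-bit buffer (8888 colour, 24S8 depth/stencil, and an 8888 buffer repurposed for motion vectors):

```python
def rt_bytes(width, height, buffers):
    """Total render-target size for N 32-bit (4 bytes/pixel) buffers."""
    return width * height * 4 * buffers

print(rt_bytes(1365, 768, 2))  # 8386560 -> 8888 colour + 24S8 depth
print(rt_bytes(1024, 576, 3))  # 7077888 -> colour + depth + motion vectors
```

Both totals land under 10,485,760 bytes (10 MB), matching the claim that neither configuration is eDRAM limited.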
 
#44
DonMigs85 said:
How much EDRAM would the next Xbox need to fit a full 1920 x 1080 buffer and at least 2x MSAA?
Also it was a sin on Sony's part to cut the ROPS and memory bus width in half for RSX over the full 7800 GTX.
Even if MS puts in enough eDRAM to accommodate that, devs will just cram in more post-processing/lighting/particles/<graphics technique of the month>, and most games will likely still be 30 fps.
 
#47
No_Style said:
Great idea. I really hope it stays technical as well. I love Digital Foundry's work.




I read that the 10MB EDRAM can handle 720p or 1080i... so the question is: what happened? Why are we seeing K&L2 sub 720p resolutions? Too complex when factoring high framerates?

I'm also wondering how much money was saved by nerfing the RSX.
Actually most 360 games are 720p or higher. If you want 'free' AA, it takes extra work to minimize the cost of tiling; but 720p without AA fits nicely into the eDRAM. The performance cost from tiling hits geometry processing, since anything that spans more than one tile has to be processed for each tile it touches.
Every system is limited though, and the more there is to do, the longer it takes. Lowering the resolution is an obvious way to free up resources, and on 360 it can allow for properly free MSAA without the extra geometry cost from tiling. That's why most of the multi-platform games with weird resolutions have MSAA on 360 but none on PS3.
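That trade-off can be sketched as a tile count. This is a rough sketch assuming 8 bytes per sample (32-bit colour plus 32-bit 24S8 depth/stencil); the tile count is just how many 10 MB chunks the frame splits into:

```python
import math

EDRAM = 10 * 1024 * 1024  # 10 MB of eDRAM on the 360

def tiles_needed(width, height, msaa_samples, bytes_per_sample=8):
    """How many eDRAM-sized tiles a frame of this size needs."""
    return math.ceil(width * height * msaa_samples * bytes_per_sample / EDRAM)

print(tiles_needed(1280, 720, 1))  # 1 -> 720p, no AA: fits, no tiling
print(tiles_needed(1280, 720, 2))  # 2 -> 720p, 2x MSAA: two tiles
print(tiles_needed(1024, 576, 2))  # 1 -> drop the res and 2x MSAA fits again
```

Which is exactly the incentive described above: shrink the framebuffer until MSAA fits in one tile and the geometry cost of tiling disappears.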
 
#48
For those interested in knowing how the 'pixel counting' works, this might help you understand.




And no, it doesn't take more than a minute to pixel count in most of the games. :lol

 
#49
MickeyKnox said:
Compare and contrast.
The only way it could have something to do with the topic at hand is if the PC game used the most console-like settings available and was set to 720p, thereby letting you see an approximation of the impact of sub-HD in a console game. Was that what you had in mind?