Wow, this thread went to shit overnight, but I can't stay away when there are things that need to be corrected.
a Master Ninja said:
This is very similar to what I see on my set. My question is, how do you guys get such great pics of your tv screens? I have a pretty good digital camera and a great tv, I want some tips for taking good off-screen photos.
http://sr-388.net/pages/photo-guide/ should hopefully help. (I never got around to finishing the guide, but the first two parts should at least give you a few pointers)
Onix said:
Seriously though, part of the problem with the comparison pics is the contrast and saturation bias the native 360 signal has.
What's making the pic not look 'wet' is in part due to lowered contrast (the highlights appear washed out in comparison).
I've done my best to match the levels on these images, without spending a lot of time on it, to get the PS3/360 output looking as similar as possible. Note: because I did it quickly, the result is a darker image overall.
I did not touch the sharpness etc., only the levels and saturation. Can we please stop arguing that it's a contrast/saturation/levels issue?
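If anyone wants to try the same kind of adjustment themselves, here is roughly what I did, sketched in Python with Pillow and numpy. The file names, black/white points and saturation factor are all placeholders; I eyeballed my values rather than calculating anything.

from PIL import Image, ImageEnhance
import numpy as np

# Placeholder file name - use your own capture.
img = Image.open("dmc4_ps3.png").convert("RGB")

# Levels: remap the chosen black/white points to 0-255 so the
# highlights and shadows line up with the other capture.
# These values are guesses to tweak by eye.
arr = np.asarray(img).astype(np.float32)
black, white = 10.0, 245.0
arr = np.clip((arr - black) / (white - black), 0.0, 1.0) * 255.0
levelled = Image.fromarray(arr.astype(np.uint8))

# Saturation: factors above 1.0 boost it, below 1.0 reduce it.
matched = ImageEnhance.Color(levelled).enhance(1.15)
matched.save("dmc4_ps3_matched.png")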
deepbrown said:
Uh no. DMC4 uses temporal aliasing - ie. blurring frames on the PS3 to create "fake" AA. The results can be seen in screenshots, but the results are hardly visible in movement at all - the same crispness is visible in both versions. I have experienced both versions of the game.
It should be visible on any good display. Most LCDs blur so much with motion, however, that you probably wouldn't have noticed it on one. I don't know how visible it would have been on plasma or DLP displays, but it sure was obvious on my CRT monitor. (which can display everything up to and including 1080p)
mintylurb said:
/me scratches head, again. See, it's not quite that blurry on my TV as you can see from my screenshots posted a few pages back.
The direct captures that have been posted in this topic are really the only way to properly judge the sharpness of the image. Comparing photos of one person's TV to another's doesn't work, as you have to factor in the display type, resolution and settings, plus the camera type, resolution and settings.
Comparing photos taken by the same person on the same display works, but it's not as good as direct captures, and there are still things that will affect how the two photos look. (whether exposure is locked, for example)
What you can say, though, is this: however much sharper the PS3 version looks on your display compared to the screenshots, the 360 version would look that much sharper on your display too.
stuburns said:
Name a single game that is sharper on PS3 than 360 (must be multiplatform of course), and provide screenshots. Direct-feed, showing some sort of HUD detail that shows the platform. And provide links to an official source so I know they aren't fixed.
I bet you can't find a single example.
The PS3's HDMI output is just as sharp as the 360's HDMI output.
Any difference between the games is purely a software difference. You could say it is caused by the hardware, in the sense that the PS3's graphics card seems to be less capable than the one inside the 360, which results in developers having to lower image quality, but there is nothing about the way the PS3 outputs 720p, 1080p etc. that inherently makes it produce a softer image.
In the case of Bioshock, the developers seem to have implemented a blur filter, presumably as a way to fake anti-aliasing. (this technique was more common last generation)
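Nobody outside the developer knows exactly what filter they used, but the effect is roughly what a light full-screen blur gives you; it softens the jaggies and takes everything else with them. A minimal sketch in Python/Pillow, with the radius being a guess on my part:

from PIL import Image, ImageFilter

# Placeholder file name. A light Gaussian blur over the whole
# frame softens stair-stepped edges - and all the detail too.
frame = Image.open("bioshock_frame.png").convert("RGB")
softened = frame.filter(ImageFilter.GaussianBlur(radius=0.8))
softened.save("bioshock_softened.png")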
In the case of a game like DMC4, they used temporal anti-aliasing on the PS3 rather than the proper anti-aliasing used on the 360 version of the game, or no anti-aliasing at all. (personally, I would have preferred none)
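Temporal anti-aliasing in this sense just means blending the current frame with the previous one, so edges that crawl from frame to frame average out, at the cost of ghosting on anything that moves. A rough sketch of the idea (the 50/50 blend weight is my assumption, and the real thing obviously runs on the GPU rather than on saved frames):

from PIL import Image

# Blend two consecutive frames 50/50: crawling edges average out
# in still scenes, but anything that moved leaves a ghost/blur.
prev_frame = Image.open("frame_0001.png").convert("RGB")
curr_frame = Image.open("frame_0002.png").convert("RGB")
blended = Image.blend(prev_frame, curr_frame, alpha=0.5)
blended.save("frame_0002_blended.png")

That ghosting is exactly why it can look fine in a still shot of a static scene but smears the moment things move.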
In the case of games like GTA IV, Dark Sector etc., the PS3 version runs at a lower resolution than the 360 version, which causes the image to appear softer. (the same applies to MGS4, for example)
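You can simulate why a sub-HD framebuffer ends up soft by taking a native 720p capture, downsampling it, and scaling it back up. The 1152x640 below is just an example sub-HD resolution for illustration, not a claim about any specific game:

from PIL import Image

# Downsample a native 720p capture to a sub-HD size, then scale
# it back up to 720p - the result shows the softness you get.
native = Image.open("capture_720p.png").convert("RGB")
subhd = native.resize((1152, 640), Image.BILINEAR)
upscaled = subhd.resize((1280, 720), Image.BILINEAR)
upscaled.save("capture_simulated_subhd.png")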
As for people that seem to prefer the softer image of Bioshock running on PS3 compared to the 360... I hate you.
It is very easy to blur an image on your TV. You could switch to component rather than HDMI, lower your sharpness setting, use negative sharpness if your display supports it, enable overscan rather than disabling it, etc. (note: on some displays, like Pioneer and Toshiba sets, the negative values are not actually negative sharpness, so sharpness should just be set to the lowest value)
It is absolutely impossible to recover the detail that has been lost to this blur filter, however. The sharpness control on your display (which should really be disabled) only enhances the edge contrast of the image and almost always adds artefacts (halos etc.) to the edges of objects. It cannot actually add any detail/sharpness back to the picture.
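If you want to see why, a display's sharpness control is essentially an unsharp mask, which only boosts contrast around the edges it finds; it has no information about the detail the blur threw away. A quick sketch with arbitrary settings (crank the percent value up and the halos become obvious):

from PIL import Image, ImageFilter

# UnsharpMask only amplifies local contrast at edges. Detail that
# was destroyed by blurring is not recovered - you just get halos.
soft = Image.open("bioshock_softened.png").convert("RGB")
sharpened = soft.filter(
    ImageFilter.UnsharpMask(radius=2, percent=200, threshold=2)
)
sharpened.save("bioshock_sharpened.png")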