
PlayStation aliasing curse is over?

I was blown away by Dreamcast's Soul Calibur IQ!

Dreamcast IQ was so sharp

bb8249423afb298b0678ad97abff57c720120501185957.jpg
 
Yeah you're right dark10x, man that shit looks fantastic.
 
I told the tech director on Twitter to recommend this AA to everyone. He said he already was and several others, like Ubisoft, are using it already.
 
As long as the resolution fits my screen's native resolution, I don't care if it's aliased or anti-aliased. On my gaming laptop I always turn off AA in games that run above an average of 30 fps at 1080p. In particular, Telltale's games look worse with AA on: the colors lose some vibrancy and the AA seems to hinder the attention to detail. Same goes for the effect used in Mass Effect 3. Seriously, that game looks better without it.
 
So many people seem to believe that GameCube, Xbox, and Dreamcast all utilized anti-aliasing of some form to combat aliasing when, in reality, 99% of the games on those three platforms did not employ any sort of anti-aliasing at all. They simply rendered at a consistently higher resolution (usually 640x480).

On PS2, there was really no standard and developers used all sorts of crazy display methods to achieve results, including field rendering (alternating rapidly between odd and even scanlines when drawing the image). Also, PS2 games typically lacked mip-mapping (similar to Sega Model 3), so distant textures appeared more shimmery. Still, as a result of these shortcuts, we saw more games on PS2 operate at 60 fps versus the other consoles. That said, a lot of games actually DID run internally in essentially a progressive-scan mode without native 480p output from the console itself. MGS2, for instance, ran at a native 512x448 that could be output progressively using a specific disc.
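To make the field-rendering trick concrete, here is a minimal sketch in Python. The scanline representation and all names are purely illustrative, not actual PS2 GS code; the point is only that each field draws half the rows.

```python
# Illustrative sketch of field rendering: instead of drawing a full
# frame, the GPU draws only the scanlines belonging to the current
# field, alternating odd/even every 60th of a second.

def render_field(scene_rows, field):
    """Render only every other scanline.

    scene_rows: full-height list of scanlines (what a full frame would be)
    field: 0 for even scanlines, 1 for odd scanlines
    Returns a half-height field; the interlaced TV weaves fields together.
    """
    return [row for y, row in enumerate(scene_rows) if y % 2 == field]

# A full 480-line frame costs 480 rows of fill; each field costs only 240,
# which is how field-rendered PS2 games halved fill cost to hit 60 fps.
frame = [f"scanline {y}" for y in range(480)]
even_field = render_field(frame, 0)   # rows 0, 2, 4, ...
odd_field = render_field(frame, 1)    # rows 1, 3, 5, ...
```

The catch, as noted above, is that each field is only half the vertical detail, which is a big part of why field-rendered games shimmered.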

Those other consoles typically stuck to a more standard output resolution with heavy usage of mip-maps in addition to superior flicker filtering for interlaced images. There really weren't many (if any) games that used multi-sampling or anything along those lines, however.
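The flicker filtering mentioned above is, at its core, just a small vertical blur: each scanline is averaged with its neighbours before interlaced output so that single-scanline detail stops strobing at 30 Hz. A hedged sketch, treating the image as lists of 0-255 luma values (illustrative only, not any console's actual filter):

```python
def flicker_filter(rows):
    """Average each scanline with its vertical neighbours (1-2-1 weights).

    This damps single-scanline detail that would flicker at 30 Hz on an
    interlaced display, at the cost of a slightly softer image.
    rows: list of equal-length lists of 0-255 luma values.
    """
    out = []
    for y, row in enumerate(rows):
        above = rows[max(y - 1, 0)]           # clamp at top edge
        below = rows[min(y + 1, len(rows) - 1)]  # clamp at bottom edge
        out.append([(a + 2 * b + c) // 4 for a, b, c in zip(above, row, below)])
    return out

# A one-scanline-high white line on black: unfiltered it would appear in
# only one field and flicker; filtered, its energy spans adjacent lines.
image = [[0] * 4, [255] * 4, [0] * 4]
softened = flicker_filter(image)
```

This is the trade-off described above: the filter stabilizes interlaced output but is one reason these consoles looked softer over composite/S-Video than their raw framebuffers were.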
Not sure, but didn't the Xbox use Quincunx in many cases? I believe that's why some games like Burnout 3 had somewhat blurrier textures compared to the PS2 version.
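For background on why Quincunx would blur textures if a game used it: NVIDIA's Quincunx mode stores two samples per pixel but resolves each pixel from five taps, its own centre sample at half weight plus four corner samples shared with neighbouring pixels at 1/8 each. Because the corner taps reach into adjacent pixels, the resolve doubles as a mild full-screen blur. A toy sketch of that resolve (sample layout simplified; illustrative only, not the actual hardware path):

```python
def quincunx_resolve(centers, corners):
    """Resolve a Quincunx-style image: centre sample at weight 4/8 plus
    the four surrounding corner samples at 1/8 each.

    centers: H x W grid of per-pixel centre samples.
    corners: (H+1) x (W+1) grid of corner samples shared between pixels.
    The corner taps reach beyond each pixel's own footprint, so the
    filter softens textures as well as polygon edges.
    """
    h, w = len(centers), len(centers[0])
    return [
        [
            (4 * centers[y][x]
             + corners[y][x] + corners[y][x + 1]
             + corners[y + 1][x] + corners[y + 1][x + 1]) // 8
            for x in range(w)
        ]
        for y in range(h)
    ]

flat = quincunx_resolve([[100]], [[100, 100], [100, 100]])  # uniform area unchanged
spot = quincunx_resolve([[255]], [[0, 0], [0, 0]])          # sharp detail pulled toward neighbours
```

Uniform areas pass through untouched, but any single-pixel detail gets averaged with its surroundings, which is exactly the texture softening the question describes.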
 
Dreamcast IQ was so sharp

bb8249423afb298b0678ad97abff57c720120501185957.jpg
Err, that's not an actual image from a Dreamcast.

Here's what it looks like on a VGA box. Notice the stair stepping (aliasing) and aggressive mip-mapping. It was still fantastic for a 1998 console but that shot you posted is super-sampled and inaccurate.

ibfaibGrKpvU6J.jpg

iHWKe2SEZmyb5.jpg
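For context on why the earlier shot looks too clean: a super-sampled capture is rendered at a multiple of the real resolution and then box-filtered down, which averages the stair stepping away. A minimal sketch of a 2x2 box downsample on a grayscale grid (illustrative only):

```python
def downsample_2x(pixels):
    """Box-filter a 2x-supersampled grayscale image down to half size.

    Each output pixel is the average of a 2x2 block of input pixels,
    which smooths hard polygon edges -- this is why supersampled
    emulator shots look far cleaner than real 640x480 hardware output.
    """
    h, w = len(pixels), len(pixels[0])
    return [
        [
            (pixels[y][x] + pixels[y][x + 1]
             + pixels[y + 1][x] + pixels[y + 1][x + 1]) // 4
            for x in range(0, w, 2)
        ]
        for y in range(0, h, 2)
    ]

# A hard black/white diagonal edge at 4x4 becomes a blended 2x2 gradient.
hi_res = [
    [255, 255, 255, 255],
    [0, 255, 255, 255],
    [0, 0, 255, 255],
    [0, 0, 0, 255],
]
lo_res = downsample_2x(hi_res)
```

Real Dreamcast hardware never did this; its output was the raw 640x480 render, jaggies and all, which is what the VGA captures above show.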


Not sure, but didn't the Xbox use Quincunx in many cases? I believe that's why some games like Burnout 3 had somewhat blurrier textures compared to the PS2 version.
Nope. It did not use Quincunx.

Take a look at Burnout Revenge. Plenty of aliasing on Xbox.

snap014.jpg


PS2 version was blurrier but had better motion blur.

zcPn3rh.png
 
If you go through the dozens of inFamous pics in the console screenshot thread, you can still see jaggies here and there. They aren't dominating the picture, though; they're rare. But it's questionable whether you could just tack that game's AA solution onto another game and get the same IQ. You can never see into the distance in inFamous and have a sharp picture, since the DoF is always present. The AA might have a harder time in a game that doesn't blur the background/far-away objects.
 
Regarding the original poster, I really do think the type of display you use combined with how that display is setup makes a HUGE difference in terms of image quality. Viewing distance, of course, also comes into play.

Putting your face up to an LCD monitor is always going to produce the most offensive results when it comes to image quality. Cleaning up an image in such a situation demands obscene amounts of anti-aliasing. I hate it.

Now, view that same image on a high-end plasma display from a more reasonable distance (say 4-6 ft) and you'll find everything appears cleaner and the overall image is more attractive. You can get away with less aggressive AA.

Now, on a CRT, screw it, you don't really need to do much at all to achieve a ridiculously clean image.

Ultra-rigid LCD sub-pixels exacerbate aliasing to the extreme, and since they lack any other image quality benefits you're pretty much forced to crank up AA. Even downscaling isn't a great option, as it produces subtle imperfections in the image that become much more noticeable when viewed in close proximity on an LCD.

I see... so even on Xbox, games like DOA3 just used the flicker filter?
You got it!

DOA3

snap012.jpg

snap020.jpg
 
I agree. The first game will always be my favorite.
Yes, the pic brought back memories. Such a fun game; I didn't stop until I beat it from beginning to end, so addictive. I like a nicely anti-aliased game myself, but if you're not going to give me good AA, give me a sharp image over blurry FXAA or Quincunx. The first inFamous was sharp, the effects shined because of it, and it was one of the few games where I wasn't bothered by the lack of AA. However, if they do remake them, good AA, 60 fps, and 1080p would be sublime.
 
So many people seem to believe that GameCube, Xbox, and Dreamcast all utilized anti-aliasing of some form to combat aliasing when, in reality, 99% of the games on those three platforms did not employ any sort of anti-aliasing at all. They simply rendered at a consistently higher resolution (usually 640x480).

On PS2, there was really no standard and developers used all sorts of crazy display methods to achieve results, including field rendering (alternating rapidly between odd and even scanlines when drawing the image). Also, PS2 games typically lacked mip-mapping (similar to Sega Model 3), so distant textures appeared more shimmery. Still, as a result of these shortcuts, we saw more games on PS2 operate at 60 fps versus the other consoles. That said, a lot of games actually DID run internally in essentially a progressive-scan mode without native 480p output from the console itself. MGS2, for instance, ran at a native 512x448 that could be output progressively using a specific disc.

Those other consoles typically stuck to a more standard output resolution with heavy usage of mip-maps in addition to superior flicker filtering for interlaced images. There really weren't many (if any) games that used multi-sampling or anything along those lines, however.
That misconception is never going to die at this point (and it's not the only one).

Lower resolutions in many games, interlacing issues, and a lack of mip-mapping (due to broken mip-mapping support on the PS2) are the usual problems on the system, not the lack of AA, since basically no game on any system at the time was using MSAA.

Lack of mip-mapping may be the biggest issue in most cases, resulting in lots of shimmering/texture aliasing. Just look at Final Fantasy X for a good example.
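To illustrate what the missing mip-mapping actually is: a mip chain is just a stack of progressively half-sized, pre-averaged copies of a texture, so distant surfaces sample an already-filtered level instead of skipping across full-size texels (that skipping is the shimmer). A rough sketch of building such a chain for a square power-of-two grayscale texture (illustrative only):

```python
def build_mip_chain(texture):
    """Build mip levels by repeated 2x2 averaging down to 1x1.

    Without this chain, a distant surface samples sparse texels from the
    full-size texture, and tiny camera moves pick different texels each
    frame -- the shimmering/texture aliasing described above.
    texture: square power-of-two grid of 0-255 values.
    """
    levels = [texture]
    while len(texture) > 1:
        texture = [
            [
                (texture[y][x] + texture[y][x + 1]
                 + texture[y + 1][x] + texture[y + 1][x + 1]) // 4
                for x in range(0, len(texture[0]), 2)
            ]
            for y in range(0, len(texture), 2)
        ]
        levels.append(texture)
    return levels

# A 4x4 black/white checker collapses toward its average gray: exactly
# the high-frequency detail that shimmers when sampled without mips.
checker = [[255 if (x + y) % 2 else 0 for x in range(4)] for y in range(4)]
mips = build_mip_chain(checker)   # sizes 4x4, 2x2, 1x1
```

Final Fantasy X's distant ground textures are the classic example: with no pre-averaged levels to fall back on, every frame re-samples the full-resolution checker-like detail and the image crawls.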
 