
High Image Quality and Resolution makes (some) old games look horrible

RedSwirl

Junior Member
I've begun to agree with OP. Texture filtering is a big part of it, but now I prefer to run older games at their original resolutions or close to it.

When I played Quake for the first time last year I did everything I could to make it look as close to 1996 software Quake as possible. I turned off texture filtering (more games and systems need to provide this option), made the characters animate like they did in '96, etc. I didn't go all the way down to 320x240 because a window that size is way too small at 1080p, but I settled for something like 1024x768. When I play Doom now I play it in Chocolate Doom or Crispy Doom in a window. I don't think going low res is a must though. Taking away texture filtering and adding scanlines if possible goes a long way.

I think games made around the early 2000s represent the borderline where the art assets look "good enough" at high resolutions. A lot of PS2 era games look great in 1080p or 4K. Still, there are PS3/360 era games that immediately show blemishes when run at anything above 720p (the textures in Dead Space 1, for example).
 

sn00zer

Member
Because you're stretching a 1080p image out onto a 1440p display. Mathematically, one pixel of that 1080p picture would have to occupy 1⅓ pixels on your monitor, which is impossible, so your GPU or monitor has to fudge it by scaling the image.
Ah is this why 4K scaling works well with 1080p?
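The mismatch above boils down to whether the resolution ratio is a whole number. A minimal sketch (the `scale_factor` helper is just illustrative, not from any real scaler API):

```python
# Scale factor = target height / source height. Only an integer ratio lets
# each source pixel map to a whole block of screen pixels; any fractional
# ratio forces the scaler to interpolate ("fudge") between pixels.
def scale_factor(src_h: int, dst_h: int) -> float:
    return dst_h / src_h

for dst in (1440, 2160):
    f = scale_factor(1080, dst)
    kind = "integer (clean)" if f == int(f) else "fractional (must be fudged)"
    print(f"1080p -> {dst}p: x{f:.3f}, {kind}")
```

This is why 1080p on a 4K panel (x2.000) can look cleaner than 1080p on a 1440p panel (x1.333), assuming the display actually does integer scaling.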
 

renzolama

Member
Dunno, I think high-res Quake and Doom look great. I've always favored high-res, clean, smooth images, and I'm not bothered by texture stretching to match, so I guess it's just a personal artistic preference. I can certainly see the validity of the argument OP makes though.
 

Lkr

Member
no idea what is going on in this thread. being able to run games at higher resolutions is what PC gamers live for
 

Orayn

Member
TVs don't necessarily perform integer scaling for 1080p -> 4K. Some earlier 4K sets specifically touted it but I imagine it would be trivial to show that most don't.
 
Yeah, I might be weird for having this opinion, but I prefer the latter for some reason. :) Consistency maybe?

I don't even entirely disagree in this case.
Both images look pretty horrible. The emulator shot partly because of poor IQ and JPG compression (and possibly emulation errors; most N64 emulators have or had problems with wrong texture tiling rates, iirc). But of course also because of the pixelated HUD.
The 320x240 image already is incredibly blurry of course, but yeah, consistency between 2D/3D elements can be seen as a plus.

Personally I would prefer a high resolution image, but possibly with the addition of CRT shader that would soften the contrast between 2D and 3D elements and make it look more consistent.


I still want to stress one aspect: the quality of textured polygons with texture filtering does not get worse when the rendering resolution is increased. It will either stay exactly the same or reveal more detail if the lower resolution was too low to show it.
 
static images are awful comparison tools

what you think is 'gritty' is actually an eyesore once you put it in motion

maybe there's an argument for low resolution + tons of AA, if you want gritty
but AA is a type of image quality
 
Yes, a 1080p image scales cleanly to 4K because each pixel is simply enlarged to a 2x2 block of four pixels. Same goes for 720p to 1440p.
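That clean 2x2 duplication can be sketched in a few lines of pure Python (a toy nearest-neighbor upscale on a list-of-lists "image", not any real scaler implementation):

```python
def integer_upscale(img, factor):
    # Duplicate each pixel into a factor x factor block: no interpolation,
    # no blurring, the image just gets bigger.
    out = []
    for row in img:
        wide = [px for px in row for _ in range(factor)]
        out.extend([wide[:] for _ in range(factor)])
    return out

src = [[1, 2],
       [3, 4]]
print(integer_upscale(src, 2))
# -> [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```

Every source pixel ends up covering an exact 2x2 block, which is what makes 1080p -> 4K (and 720p -> 1440p) lossless in principle.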

Does integer scaling have any benefits when downsampling for purposes of enhancing game image quality? Or it's only relevant when upscaling?
 

nded

Member
Does integer scaling have any benefits when downsampling for purposes of enhancing game image quality? Or it's only relevant when upscaling?

Somewhat, though you'd only really notice it if your downscaling process somehow didn't filter the image at all.

Unfiltered downsample from source w/ 4x the pixels:
[image: ZgsD30L.png]

Unfiltered downsample from source w/ 4.8x the pixels:
[image: AjfCQDw.png]

As you might be able to tell, the one downscaled from the 4x source has nicer looking diagonals and curves, while the other one looks a bit jumbled in places as pixel groupings don't fall into place quite as neatly. Proper filtering will mitigate a lot of these problems, though a perfect integer downsample would still introduce slightly less blurring, as the scaler doesn't have to fudge as much.
 

optimiss

Junior Member
Yeah, I might be weird for having this opinion, but I prefer the latter for some reason. :) Consistency maybe?

I agree with you. At the original resolution there is at least some cohesiveness to the presentation. The high res one feels so disjointed and lifeless.

The second image is also poorly upscaled; I can see ringing from the sharpening in the upscale method, which is muddying up the visuals even more. High resolution displays actually display/upscale low res art (320x240, for example) better than medium resolution art (720x480, for example), because the giant pixels lead to a greater percentage of the scene being a solid color where the upscaling isn't evident.
 
The second image is also poorly upscaled; I can see ringing from the sharpening in the upscale method, which is muddying up the visuals even more. High resolution displays actually display/upscale low res art (320x240, for example) better than medium resolution art (720x480, for example), because the giant pixels lead to a greater percentage of the scene being a solid color where the upscaling isn't evident.

The upscale filter was Lanczos, which can produce ringing, yeah. I don't like it either, but I wanted to use a filter that was not too blurry for the comparison.

Another quick and dirty comparison using the emulator shot as a base (far from perfect this way, but I didn't want to bother finding something better):

1. 320x240 native (downscaled)
2. 1024x768 nearest neighbor upscale
3. 1024x768 gaussian upscale
4. 1024x768 native
5. 1024x768 native with CRT shadow mask shader


Pick your poison. ;)

Personally I like the CRT shader shot best, even though it isn't tuned and could look better.
 

univbee

Member
Also just a note, 1440p isn't "2k".

Not quite. Real 2K is mainly a resolution used in the movie industry and for cinema projection, and isn't really available at the consumer level. DCI 2K is 2048x1080; the image is then optically stretched as required for the aspect ratio of the film in question (so the resolution doesn't change, but the pixels get very rectangular). This is the highest resolution offered for all 3D films, it's also the resolution at which pretty much all CGI was done back when movies were shot on film stock, and the vast majority of "4K" movie content, including UHD Blu-rays, is upscaled from a 2K master.
 