I've begun to agree with OP. Texture filtering is a big part of it, but now I prefer to run older games at their original resolutions or close to it.
When I played Quake for the first time last year, I did everything I could to make it look as close to 1996 software Quake as possible. I turned off texture filtering (more games and systems need to provide this option), made the characters animate like they did in '96, etc. I didn't go all the way down to 320x240 because a window that size is way too small at 1080p, but I settled for something like 1024x768. When I play Doom now I play it in Chocolate Doom or Crispy Doom in a window. I don't think going low-res is a must, though. Turning off texture filtering and adding scanlines where possible goes a long way.
I think games made around the early 2000s represent the borderline where art assets look "good enough" at high resolutions. A lot of PS2-era games look great in 1080p or 4K. Still, there are PS3/360-era games that immediately show blemishes when run at anything above 720p (e.g., the textures in Dead Space 1).