I'm a professional rendering artist in the industry, and here are a couple of my thoughts:
The general public, and even most hardcore PC gamers, don't really know what they're talking about, because they're basing their opinions on current game renderings, current AA tech, current realtime shaders, and current effects. 720p PS3 games look more realistic than the 1080p HD-remastered PS2 games, and can do things the PS2 games can't in terms of enemies on screen, etc.
What ultimately happens is going to depend on the game, the director, and the artists - just like it always has. You'll get games like Shadow of the Colossus that opt for a new level of realism and accurate physics at the expense of resolution and framerate, and you'll get games like God of War with higher framerates and resolution, but simpler rendering.
Yet a game like UT99 or CS is infinitely more playable than the 'realistic' Killzone 2 or Uncharted 2, because the horrendous input lag in both those realistic games makes it feel like you're wading through a sea of jelly and/or extremely drunk.
For any game where you actually have to be in control and react to things (so basically anything but slow RTS and turn-based games), you NEED 60 fps to make it feel right.
Nothing can possibly make up for a game that doesn't control right, nothing.
I'll take 1998 graphics with no discernible input lag over 2011 graphics with 100-200 ms of input lag, and so will anyone who actually wants to play videogames instead of only look at them.
No matter how good a game (subjectively) looks, input lag is NOT ok.
If you think that initial first-hour impression of a game looking a bit better is worth sacrificing the gameplay feel, then you have your priorities mixed up.
I started out playing Battlefield 3 multiplayer on my PC on Ultra, since it gave me an average of about 60 fps (with drops to the high 30s), and I was impressed by the graphics for about an hour or two.
By the time I hit the last rank in the game, I had gradually lowered my settings so it would never drop under 60 fps, because the feeling of increased input lag whenever the framerate dropped lingered LONG after the graphics wow factor was gone, and it was ruining the feel of being in direct control of the game.
I was already playing on a CRT monitor (so at all times I had 20-60 ms less input lag than I would on an LCD monitor - possibly more if the LCD also had to scale the image) with a wired mouse, and even on that ideal setup the input lag at 30 fps was ruining the feel of the game.
I can't even imagine wanting to play the game at 30 fps on an LCD TV that has to upscale and process the image, adding another 100+ ms of latency.
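To put rough numbers on the difference, here's a back-of-the-envelope sketch. The function names, and the assumption that the render pipeline buffers about two frames, are my own; the display-latency figures are just the ballpark numbers mentioned above, not measurements:

```python
# Back-of-the-envelope input-lag estimate.
# ASSUMPTION: the render pipeline holds ~2 buffered frames; display_ms is
# an illustrative figure for the monitor/TV's processing delay.

def frame_time_ms(fps: float) -> float:
    """Duration of one frame in milliseconds."""
    return 1000.0 / fps

def total_latency_ms(fps: float, display_ms: float, buffered_frames: int = 2) -> float:
    """Input lag ~= buffered frames of render time + display processing."""
    return buffered_frames * frame_time_ms(fps) + display_ms

# CRT at 60 fps (near-zero display processing) vs. a processing-heavy LCD TV at 30 fps
crt_60 = total_latency_ms(60, display_ms=0)    # ~33 ms
tv_30 = total_latency_ms(30, display_ms=100)   # ~167 ms
```

Even with these crude assumptions, the 30 fps + TV-processing case lands in the 150+ ms range the post complains about, roughly five times the 60 fps CRT case.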
I really think that most people who are ok with it just don't know any better. If they knew what 1:1, no-delay input felt like, without anything messing with their timing, they would not be ok with it at all.