The notion that configuring one's graphics settings is complicated has to be one of the most ill-informed positions I've ever heard. Do you people not dress yourselves? I can't imagine how a few sliders and words like "on" and "off" could be difficult to interpret. Even if you don't understand a setting, you can reasonably assume that turning it off will make your game run better, while turning it on will make it run worse. The only remotely difficult decision is Vsync, and its effect is often so drastic that it's easy to form an opinion after a couple of minutes.
The argument that this has anything to do with multiple hardware configs and optimization ignores the fact that preference plays into performance any time a user is forced to trade off performance against image quality, which is almost always. I don't tolerate framerates below 60 FPS or input lag of any kind in a multiplayer game, so I'll take screen tearing and blockiness if that's what it takes. Right now, if the developer decides to sacrifice framerate for image quality (which they frequently do), that's a loss for me. Having the option to change that can only benefit me and can't possibly hurt you.