
Judging game performance at "max settings" is enormously counterproductive

Durante

Member
Performance at "max" settings, without context and deep understanding what these settings entail, is completely irrelevant for judging the technical quality of a game, and it's highly damaging how often it seems to be used to evaluate the same. I've wanted to make a thread about this for a while, and seeing how there is right now another one on the front page with "max settings" in the title it seems as good a time as ever.

These days, many people seem to judge the "optimization" (a broadly misunderstood term if I ever saw one!) of games by how they run at "max" settings. What does this mean in practice? Let's say I'm porting a game to PC, and I'm trying to decide which options to include. I could easily add the option of rendering shadow depth buffers at 32-bit precision and up to 4096x4096, instead of the 16-bit, 1024² default. But what would this actually do? Basically, it will improve IQ and image stability, especially at very high resolutions. However, let's assume for the sake of argument that it also halves the framerate of my port when enabled.
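To make that cost concrete, here's a minimal sketch of what such an option boils down to at the texture-allocation level. It assumes an OpenGL renderer with a context and loader (e.g. GLEW) already set up; the function and the exact numbers are illustrative, not from any particular engine:

  // Hypothetical sketch: allocate the shadow depth buffer at either the
  // conservative default or the "max" tier described above.
  #include <GL/glew.h>

  GLuint createShadowMap(bool maxQuality)
  {
      // Default: 1024x1024 at 16-bit depth. "Max": 4096x4096 at 32-bit float,
      // i.e. 16x the texels at twice the bytes per texel.
      const GLsizei size   = maxQuality ? 4096 : 1024;
      const GLint   format = maxQuality ? GL_DEPTH_COMPONENT32F : GL_DEPTH_COMPONENT16;

      GLuint tex = 0;
      glGenTextures(1, &tex);
      glBindTexture(GL_TEXTURE_2D, tex);
      glTexImage2D(GL_TEXTURE_2D, 0, format, size, size, 0,
                   GL_DEPTH_COMPONENT, GL_FLOAT, nullptr);
      glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
      glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
      return tex;
  }

Rendering into and sampling from that larger, higher-precision buffer every frame is where the hypothetical framerate hit comes from.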

In the prevailing simplistic mindset, I just went from a "great, optimized port" to a "piece of shit port showing how my company is disrespectful of PC gamers" merely by adding an option to my game.

I hope everyone can see how fucking insane this is. As a developer aware of this, I basically have 2 options:
  1. Only allow access to higher-end settings via some ini file or other method which is not easily accessible.
  2. Simply don't bother with higher-end settings at all.
The first point wouldn't be too bad, but it seems to be the much rarer choice. If the prevailing opinion of my game's technical quality actually goes down because I included high-end options, then why bother at all?
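As a rough illustration of option 1, the hidden tier can be as simple as a plain-text override file the menu never mentions. The file name, keys and struct below are all made up for the sake of the example:

  // Hypothetical sketch: the in-game menu stops at "High"; the override file
  // is the only way to reach the settings discussed above.
  #include <fstream>
  #include <sstream>
  #include <string>

  struct ShadowSettings {
      int  resolution    = 1024;   // what the menu exposes
      bool highPrecision = false;  // 32-bit depth buffer
  };

  ShadowSettings loadShadowOverrides(const std::string& path = "user_overrides.ini")
  {
      ShadowSettings s;
      std::ifstream in(path);          // silently keep menu defaults if the file is absent
      std::string line;
      while (std::getline(in, line)) {
          std::istringstream kv(line);
          std::string key, value;
          if (std::getline(kv, key, '=') && std::getline(kv, value)) {
              // no error handling; this is just a sketch
              if (key == "ShadowMapResolution") s.resolution    = std::stoi(value);
              if (key == "ShadowMap32Bit")      s.highPrecision = (value == "1");
          }
      }
      return s;
  }

The menu keeps writing sane values; anyone who goes digging for the file has, presumably, accepted the consequences.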

Of course, gamers are not to blame for this exclusively. Review sites got into the habit of benchmarking only "max" settings, especially during the latter part of the PS360 generation, simply because GPUs wouldn't be challenged at all in the vast majority of games otherwise.
 

KJRS_1993

Member
To be honest, I wouldn't spend a ton of time worrying about the kind of people who would judge a game's performance by such an arbitrary metric.
You won't ever please them, and they are such a minority compared to the rest of the userbase anyway.

You wouldn't believe it if you browse here too much, but most people just want to play and enjoy something, and aren't too fussed about the technical details (as long as it turns on and works).
 
I feel like this should be self-evident, yet I've seen the argument you mention often enough to know that apparently it's not. Indeed, what this ends up causing is that companies never bother to add any options exclusive to PC, let alone stuff that entails development time like higher-resolution textures, as they (rightfully) see them as counter-productive.

PC games should be judged and compared on a standard set of features (same resolution, same shadow algorithm, same antialiasing solution and so on) but it seems that's too much of an effort for reviewers, let alone the average PC gamer. :/
 
I always wonder why companies do not "future-proof" their games with assets at their original resolution (e.g., id with Rage)... but then people would end up cranking everything up and calling it "a shitty console port"...

You should either put a box next to the quality setting specifying what kind of specs you need, or show some sort of warning when going beyond what's commonly available at the time of release (and the obligatory unreal "holy shit" when setting everything to ULTRA).

OR, you could unlock the ULTRA setting only after obtaining a certain score in a benchmark tool provided with the game.

EDIT: I do feel that TotalBiscuit does this right as he evaluates performance at max with two Titans or something... and at that point, if your game has frame rate drops it's probably because it has some problems (or am I wrong?).
 

Makareu

Member
Agree, but too many people and review sites just want to judge and draw absolute conclusions.
In the end a PC developer will be called lazy or bad (though some of them really are); it's a no-win situation.
 

OmegaDL50

Member
I see your point.

Not everyone is going to max out the game, and maxed settings shouldn't be seen as the "default" experience for the mainstream userbase that plays a game.

The majority of PCs and people playing PC games aren't running high end GPUs and setting the details to max.

So using that as a measuring stick for what to expect performance-wise would be highly misleading, because the general masses wouldn't be getting the same experience that the smaller audience of enthusiast-level PC owners typically considers the norm.
 

mrklaw

MrArseFace
But the flip side of your argument is that nobody would ever review the technical performance of a PC game, because no matter what they are running it on, they can lower the settings to make it run smoothly.
 

JaseC

gave away the keys to the kingdom.
I always wonder why companies do not "future-proof" their games with assets at their original resolution (e.g., id with Rage)... but then people would end up cranking everything up and calling it "a shitty console port"...

The thing with Rage is that the on-disc assets are accurate representations of the source assets:


The game was built with the lowest common denominator in mind, which in this case was the storage capacity of two X360 DVDs.
 

Kezen

Banned
I agree with everything said in the first post. It's appalling to see a non-negligible portion of the PC audience crying foul when a game has the audacity to offer very demanding settings. Not only that, but they complain about performance without even trying to understand what they are asking of their hardware.
 

Durante

Member
So at what settings should we judge game performance? (or am I missing the point?)
For comparing across games, the default settings. For comparing to the console performance, console-equivalent settings.

Of course you can also benchmark the "max" settings, but every time you talk about them you should be cognizant of what they actually entail.

The truly bad part is really "game A runs at 60 FPS maxed, game B at 15, ergo game A is much better optimized than game B". And yes, that is what far too many arguments about performance and optimization on gaming forums boil down to.
 
Performance at "max" settings, without context and deep understanding what these settings entail, is completely irrelevant for judging the technical quality of a game, and it's highly damaging how often it seems to be used to evaluate the same. I've wanted to make a thread about this for a while, and seeing how there is right now another one on the front page with "max settings" in the title it seems as good a time as ever.

These days, many people seem to judge the "optimization" (a broadly misunderstood term if I ever saw one!) of games by how they run at "max" settings. What does this mean in practice? Let's say I'm porting a game to PC, and I'm trying to decide which options to include. I could easily add the option of rendering shadow depth buffers at 32-bit precision and up to 4096x4096, instead of the 16-bit, 1024² default. But what would this actually do? Basically, it will improve IQ and image stability, especially at very high resolutions. However, let's assume for the sake of argument that it also halves the framerate of my port when enabled.

In the prevailing simplistic mindset, I just went from a "great, optimized port" to a "piece of shit port showing how my company is disrespectful of PC gamers" merely by adding an option to my game.

I hope everyone can see how fucking insane this is. As a developer aware of this, I basically have 2 options:
  1. Only allow access to higher-end settings via some ini file or other method which is not easily accessible.
  2. Simply don't bother with higher-end settings at all.
The first point wouldn't be too bad, but it seems to be the much rarer choice. If the prevailing opinion of my game's technical quality actually goes down because I included high-end options, then why bother at all?

Of course, gamers are not to blame for this exclusively. Review sites got into the habit of benchmarking only "max" settings, especially during the latter part of the PS360 generation, simply because GPUs wouldn't be challenged at all in the vast majority of games otherwise.

Reading PC threads on here I've wondered the same thing. Isn't optimisation about how well the game runs on lower-end hardware? So who cares what fps you can get at max settings?
 

Kezen

Banned
The truly bad part is really "game A runs at 60 FPS maxed, game B at 15, ergo game A is much better optimized than game B". And yes, that is what far too many arguments about performance and optimization on gaming forums boil down to.

Hence folks saying the last DMC is better "optimized" than Crysis 3....
Cringeworthy.
 
If there is a frame-killing option, I think developers should at least put a disclaimer or something next to it.

Took me a while to realize that the "POST FX" option in Far Cry 3 kills my framerate if I don't set it to low.
 

Durante

Member
But the flip side of your argument is that nobody would ever review the technical performance of a PC game, because no matter what they are running it on, they can lower the settings to make it run smoothly.
In a technical review, they should evaluate both the performance and the visual quality of the game at the settings it chooses by default. This forces the developer to implement good defaults and make a trade-off between performance and image quality.

Then you can separately evaluate how well the game scales up and down, again both in visuals and performance. I realize this is much more work than throwing "maxed FPS" out there, but it's the only way to do it right.
 

Setsuna

Member
I always wonder why companies do not "future-proof" their games with assets at their original resolution (e.g., id with Rage)... but then people would end up cranking everything up and calling it "a shitty console port"...

rage's textures were ass on release and they are even worse now
 

Abounder

Banned
I've seen benchmarks using high settings instead of max, especially if the latter isn't optimized. In my opinion the best compromise is to support both the Star Citizen enthusiasts and average PC users by having presets for optimized, high, ultra, etc., but allowing dropdown changes if the user wants them. PC players want and expect options, and will mod your game to get them.
 
I don't remember people bitching at The Witcher 2 for being unoptimized yet it had an Ubersampling mode which made the game run much slower.
 

Durante

Member
I don't remember people bitching at The Witcher 2 for being unoptimized yet it had an Ubersampling mode which made the game run much slower.
I do remember some discontent about this, but you are right that it wasn't as bad as for some other games. I think it helped that they only made the option available in the advanced settings, and then even made it bold and red and scary. It's almost like putting it in an ini file.
 

gelf

Member
Totally agree. Max settings is such an arbitrary and variable thing that it means nothing when comparing the performance of PC games. This probably explains why I've played many a PC port I've seen called "unoptimised" and come away not seeing what the fuss is about.
 

Almighty

Member
I think I agree. Just showing a game at max settings isn't showing the full picture, and personally I would like to see how a game performs across a wide range of settings and hardware. Though I think a lot of problems would be lessened if the people doing the tests took more time to explain what exactly the game is doing at max settings and how that is affecting performance. For a console port it might also be helpful to show how the game performs on settings equivalent to the consoles and then how it performs at max settings, while explaining what the difference, if any, is between the two.
 
As mentioned previously by others, devs need to provide an option called "Console Settings", which would actually give us a good idea of how well optimised a PC port is.
 

MaLDo

Member
A big problem with graphics settings is that the most advanced effects produce a very subtle difference for untrained eyes while being the most complex. Those effects are usually the newest ones, so they are the least optimized, which makes them the ones that hurt performance the most.

Taking into account the open nature of PC hardware, only in PC games can it happen that a developer adds a new effect that has not yet reached a good balance between visual outcome and performance impact. And that is where users without technical experience will complain about the dev's audacity.
 
The thing with Rage is that the on-disc assets are accurate representations of the source assets:

The game was built with the lowest common denominator in mind, which in this case was the storage capacity of two X360 DVDs.

So, you're really telling me that some 2d artist actually made the rightmost box like that? Drawing small blurry dots instead of text?
http://hothardware.com/newsimages/Item19178/small_rage-text.jpg

Or even worse... these food cans?
http://i.imgur.com/z8uDh.jpg

Personally I think it's more trouble to draw them like this... and Carmack is probably lying, don't you think?

rage's textures were ass on release and they are even worse now

Yup. The quality itself is awful, but the rock textures look really impressive from a distance, really organic (that's the whole point of megatextures...)
 

JaseC

gave away the keys to the kingdom.
rage's textures were ass on release and they are even worse now

The sharpening filter is optional.

So, you're really telling me that some 2d artist actually made the rightmost box like that? Drawing small blurry dots instead of text?
http://hothardware.com/newsimages/Item19178/small_rage-text.jpg

Or even worse... these food cans?
http://i.imgur.com/z8uDh.jpg

Personally I think it's more trouble to draw them like this... and Carmack is probably lying, don't you think?

I don't see why he'd lie about most of the source assets not having more detail when he could have just dismissed the calls for a high-res texture pack by saying that it'd be a prohibitively large download. That crappy box in your image would presumably fall outside the scope of "most".
 

mclem

Member
I've noticed that I'm increasingly playing PC games using my upstairs desktop to provide the power, but through my downstairs laptop plugged into the TV so I get the comfy couch experience, using Steam Home Streaming. That's glorious, but it does mean I have to consciously make concessions about fidelity, and I'm always having to view any details about how a PC game looks through that filter.
 

C.Dark.DN

Banned
System requirements need to list a rated resolution, frame rate, and corresponding graphics setting.

That would improve optimization for each tier. Instead of "lol, you figure out how to hack it and have it run at ultra."
 

Sectus

Member
I think another part of the problem is that there are quite a few people who go straight for max settings because they want the prettiest graphics, and they don't consider the fact that it might not be playable unless they have the highest of high-end hardware.

I think the best solution is to name the highest settings something which sounds like it's above "max", and give the user a really big warning when he first tries to access those settings.
 

Spazznid

Member
Like I said in the Ultra/Max thread, I think that this would blur a lot of people's perception of PC capabilities. I for one would rather have an ini-based standard for increasing graphical settings for those who:

A. Know what they're doing.
B. Care about it at all.
C. Have the capability to use such settings.


The chance that developers would use low performance as a way to brag about their game's specs is too great for me to be comfortable with.
 
I don't remember people bitching at The Witcher 2 for being unoptimized yet it had an Ubersampling mode which made the game run much slower.

They were pretty clear on that point. The option is highlighted in red, for one, and they said in advance that it wasn't meant as an option to run on PCs at that time.
 
In some senses 'optimise' is a poor term because it also refers to what the user has to do in balancing the graphical settings of a game to the hardware they're using to run it.

When applied to developers I take optimisation to mean making performance as lightweight as possible. At the same time equalising the scenarios the game will produce so that performance is reliable (ie. not interspersing simple scenes that run well on high settings with complex ones that chug).

Edit: Options are never a bad thing in my book. I'd consider a game optimised if it included settings that let low-end systems run the game, together with settings that let high-end systems go wild. And in both cases, gave performance that was consistent.
 

Denton

Member
Devs should do a much better job of describing the various graphical settings in games, including info about what specs they call for. I would also recommend putting the demanding, high-end options behind some kind of expert screen, with a disclaimer that they are future-proof and not meant to be easy on hardware.
 
Without context people will just make assumptions. But I don't think it's true that people whine about games being unoptimized pieces of shit *in all cases* when "maxed settings" isn't realistic on release. Did many people complain that Ubersampling in The Witcher 2 was unavailable to most gamers, and that it tanked framerates? I don't recall ever seeing anyone accuse the game of being unoptimized because that setting existed. This is probably because the devs explained that setting in pre-release info and it was pretty well known that it wasn't supposed to be a "sane" setting at the time. I would contend that this is an issue of communication.

If you as a developer want to avoid misunderstandings, it might be beneficial to provide detailed information about each setting in your game. In most cases, settings pages are sterile checklists and sliders with at most one line of text explaining what each does (and sometimes not even that). If you want to include higher-resolution settings in an effort to future-proof your game, or to give people on ultra-high-end rigs something to strive for, perhaps explain that in the game: that it's not intended to run on existing rigs at a reasonable framerate, and that the game will still look good without it.
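As a sketch of the kind of per-setting information being asked for here, expressed as data a settings screen could show in a tooltip or side panel (all names and strings below are invented for illustration):

  #include <string>
  #include <vector>

  // Hypothetical per-option metadata, surfaced next to the checkbox instead
  // of a bare one-word label.
  struct GraphicsOption {
      std::string name;
      std::string description;   // what the setting actually changes
      std::string perfNote;      // rough cost in plain language
      bool        futureProof;   // not expected to run well on current hardware
  };

  const std::vector<GraphicsOption> kShadowOptions = {
      { "Shadows: High",
        "2048x2048 shadow maps, 16-bit depth.",
        "Moderate GPU cost; fine on most mid-range cards.",
        false },
      { "Shadows: Insane",
        "4096x4096 shadow maps, 32-bit depth; sharper, more stable shadows at high resolutions.",
        "Can roughly halve the framerate. Intended for future hardware.",
        true },
  };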
 
The sharpening filter is optional.



I don't see why he'd lie about most of the source assets not having more detail when he could have just dismissed the calls for a high-res texture pack by saying that it'd be a prohibitively large download. That crappy box in your image would presumably fall outside the scope of "most".

Yeah. I didn't give much weight to the word "most"...
Still, I find a blurry object more distracting when there was clearly supposed to be text on it... they could have at least released those textures which, as you say, fall outside the scope of "most".
 
I feel like this was something that hurt Final Fantasy XIV 1.0.

The game could have been better coded, and it was kind of demanding for an MMORPG. But a lot of the criticism of Final Fantasy XIV was that it needed too nice of a computer, when a lot of people were running the game on maximum settings anyway.
 
For comparing across games, the default settings. For comparing to the console performance, console-equivalent settings.

Of course you can also benchmark the "max" settings, but every time you talk about them you should be cognizant of what they actually entail.

The truly bad part is really "game A runs at 60 FPS maxed, game B at 15, ergo game A is much better optimized than game B". And yes, that is what far too many arguments about performance and optimization on gaming forums boil down to.

I believe this is what DF was doing with the later PS360 multiplats: they compared the consoles with the PC versions at 720p settings.

They seem to have changed their methods again, as they're now setting the current-gen consoles against PC games at 4K resolutions.
 

Slavik81

Member
This sounds like it's partly a UX problem, as games typically provide little context as to what the settings entail.
 

Arulan

Member
As I posted in the other thread:

One thing I'd like to mention is that the obsession with "maxing out" a game can be harmful to the PC industry. Ideally, games should not be able to be "maxed out" until years after release, because of how much they're pushing the tech. This does not mean a game shouldn't be well-optimized, or that what can be achieved today, say "High" settings for example, shouldn't look as good as what we expect today when we "max something out".

I think this mentality is very harmful to PC games and I would love to see games push their tech as far as possible.
 
In my book a port (or otherwise) is shoddy when I'm at or well over the recommended system requirements and the game runs terribly, even on low settings/resolutions.
I would never gripe about not being able to max a game when you see some of the "insane" settings that are more geared toward high-end/future hardware.

But in a PC environment where tweens spout all sorts of childish PC elitism, I'm not all that surprised by the attitude.
 

RooMHM

Member
Create pre-configured settings for the masses, up to High. Add Ultra with yet more resource-demanding options, and for the love of god allow ini customization and a CONSOLE!
 

Durante

Member
Devs should do a much better job of describing the various graphical settings in games, including info about what specs they call for. I would also recommend putting the demanding, high-end options behind some kind of expert screen, with a disclaimer that they are future-proof and not meant to be easy on hardware.
These seem to be good basic guidelines. It's more or less what Witcher 2 did.

I also agree with other posts that settings should be well-described.
 
Gotta agree with the OP, but I also really think devs should be more transparent about what their settings do and give access to more cvars. I was mentioning to MaLDo quite recently how much I love Croteam games due to the ridiculous amount of graphical and performance settings accessible to the user (in menu). These must be ridiculously confusing for most people though.

On the other hand....

I like the idea of Ultra being hidden in an .ini file (in fact, that sounds like a wonderful way to prevent the "unoptimized" yelling). Imagine if Crysis 1 had launched with Very High only being in the config file... that game would have an incredibly different reputation, for better and for worse. This would in general leave the enthusiast settings... to the enthusiasts!
 
Well, most of the PC benchmarks are kind of pointless. If you aren't one of the few people who upgrade hardware every year, you will need to work out the right settings anyway.

Such articles should rather focus on which graphics effects are resource hogs.
 
I do remember some discontent about this, but you are right that it wasn't as bad as for some other games. I think it helped that they only made the option available in the advanced settings, and then even made it bold and red and scary. It's almost like putting it in an ini file.
So, were you digging through Watch_Dogs when you decided to make this thread?

I completely agree. They should throw a disclaimer next to the max setting stating that some features may adversely affect performance and call it good.
 

Dr Dogg

Member
I do feel the words 'optimised' and 'unoptimised' get thrown around with reckless abandon when discussing how a game performs. Realistically the focus should be on how scalable a game is, not on some arbitrary level of performance people expect from their hardware no matter how costly the settings or effects they're trying to push. If you can't get the performance you desire at the settings you have chosen, but you can when you adjust the settings to suit your hardware, then you have a scalable game. If you get bad performance across every combination of settings and you're not using outdated or unreasonable hardware, then there might be an issue with the game (though I can think of about 2 games in the last decade that have been like this for me).

One thing I find frustrating is that some games cluster lots of their settings under one toggle. Sometimes there's a Level of Detail toggle with four options whose names don't really convey what you are adjusting, and you end up enabling or disabling 3, 5, 6, maybe more effects at once. I get that this makes things easier for people who don't understand the performance cost of such effects, and avoids overwhelming or confusing them so they can just fire and forget at something that runs, but it doesn't afford anyone the ability to tailor the game for the best performance possible. Some games do let you break these settings apart in the ini files, but others don't.
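A small sketch of that complaint and a possible fix: one composite toggle really drives several independent knobs, and exposing that mapping (plus letting an ini override individual values) serves both audiences. The cvar names and numbers below are invented:

  #include <map>
  #include <string>

  // Hypothetical preset: one "Level of Detail" choice fans out to several cvars.
  std::map<std::string, int> applyLodPreset(const std::string& preset)
  {
      std::map<std::string, int> cvars;
      if (preset == "Low") {
          cvars = { {"r_drawDistance", 600},  {"r_foliageDensity", 0},
                    {"r_particleDetail", 0},  {"r_tessellation", 0} };
      } else if (preset == "High") {
          cvars = { {"r_drawDistance", 1500}, {"r_foliageDensity", 2},
                    {"r_particleDetail", 2},  {"r_tessellation", 1} };
      } else {  // "Ultra"
          cvars = { {"r_drawDistance", 3000}, {"r_foliageDensity", 3},
                    {"r_particleDetail", 3},  {"r_tessellation", 2} };
      }
      // Individual overrides from an ini would be applied on top here, so a
      // user could keep "High" overall but turn just the draw distance back up.
      return cvars;
  }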
 

mrklaw

MrArseFace
I've noticed that I'm increasingly playing PC games using my upstairs desktop to provide the power, but through my downstairs laptop plugged into the TV so I get the comfy couch experience, using Steam Home Streaming. That's glorious, but it does mean I have to consciously make concessions about fidelity, and I'm always having to view any details about how a PC game looks through that filter.

You shouldn't lose a lot of performance with streaming? Or do you mean 30/60 FPS etc?
 