
Judging game performance at "max settings" is enormously counterproductive

I think the most important thing is how it runs compared to a console and what price range of PC can run it at those settings. If a $500 PC can run a game better than an Xbone, that's a positive for PC.
 

KKRT00

Member
Yeah, hardware reviewers are also to blame for this.
I've seen many, many times people posting screenshots of benchmarks from games like Sleeping Dogs or Metro: LL with ultra settings at 1080p where the framerates were in the 30s, as a showcase of the ridiculous requirements of PC gaming and awful PC ports.
Those people of course did not understand that those games were additionally running 4xSSAA at those settings...
 

Scrabble

Member
This mindset has always pissed me off as well. Crysis 3 on low settings looks and runs better than the vast majority of games, but because Ultra is an option, it's deemed to have shoddy performance. Why?
 

Scrabble

Member
So at what settings should we judge game performance? (or am I missing the point?)

How good the game looks, how large-scale the environments are, what fps I'm getting, etc. If a PC game on medium settings still runs just as well as any other console port at max settings with 8x anti-aliasing, etc., yet looks better in the process, how can that be bad optimization?
 

Deepo

Member
They were pretty clear on that point. The option is highlighted in red for one, and they said in advance it wasn't meant as an option to run on PCs at that time.

Ubersampling wasn't highlighted in red at launch, and it was included in the Ultra preset. Durante's point still applies there, but back then it wasn't as simple to identify it as the performance murderer it is.
 
So at what settings should we judge game performance? (or am I missing the point?)

All of them? A good game, ideally speaking, should scale well and have a nice ratio of visual quality to performance at most settings, giving both low-end and high-end consumers the opportunity to enjoy the game.

And in any case, performance itself isn't to be judged simply by looking at a Fraps fps number in a corner, but by correlating the performance with the visuals given (and other aspects).

And the settings themselves, and how they allow players to tweak the game, are also something to be judged.



edit: Just imagine for a moment a game with good visuals and great performance, with 100fps at high settings, but at a very high setting it falls to 40fps, even though visual quality is very similar. Is the game's performance good or bad? You could criticize the existence of an unoptimized visual option in "very high", which is causing the fps drop... but why would you play on very high if we just said the graphics are very similar on "high" and the performance there is great?
 

Ty4on

Member
This mindset has always pissed me off as well. Crysis 3 on low settings looks and runs better than the vast majority of games, but because Ultra is an option, it's deemed to have shoddy performance. Why?
I was thinking about bringing up Crysis 3. Low looks very similar to very high and can run on integrated GPUs.
 
For comparing across games, the default settings. For comparing to the console performance, console-equivalent settings.

Of course you can also benchmark the "max" settings, but every time you talk about them you should be cognizant of what they actually entail.

The truly bad part is really "game A runs at 60 FPS maxed, game B at 15, ergo game A is much better optimized than game B". And yes, that is what far too many arguments about performance and optimization on gaming forums boil down to.

People bitch when the 15 fps game only looks on par with or worse than the 60 fps game, and in that case the bitching is perfectly justified. It's pretty much an industry standard at this point that any settings above and beyond the console settings are nothing but half-assed crap, worked on only enough so the game still functions. Performance optimization is completely ignored. Typically, unless it can simply be enabled by typing a different word in an ini file, or Nvidia/AMD come in and do the coding themselves, there will be no extra PC features.
 
Well, judging by the example the OP took, I'd say you can judge a port compared to the original release and compare performance to other ports.
 

Calabi

Member
Wasn't there a lot of fuss about GTA IV? Everybody was saying it ran like an unoptimised piece of crap when the settings went way beyond what the consoles could do. It was difficult finding the exact settings that were optimal for your PC. I'd go backwards and forwards with them for ages trying to figure out what framerate I wanted with what visuals.

I think they should leave the stupid settings in an ini. Have the max optimised settings in the menu, and have you go into the ini, like Skyrim, to go beyond that at your own risk.
 
options that severely cripple performance without substantially improving the visual experience are great examples of poor coding and developer decisions.
 
Crysis, Metro 2033, The Witcher 2, and any game with either a 4-8x MSAA or a supersampling option included in the menu are examples of this.
Crysis 1 scaled well with its graphics settings, nothing unoptimised about that game.


options that severely cripple performance without substantially improving the visual experience are great examples of poor coding and developer decisions.
No, they are just options, you contrarian

edit: this guy has to be trolling, no real effort beyond what the consoles are running?
For starters, Crysis 1 was made for PC, with no intention to port it to consoles until years later, so your argument falls apart right there (as does any pretense of you knowing what you're talking about).

Crysis was the first game to include proper godrays (sounds boring now, but people ogled over them at the time), the first game with such a high-quality per-object motion blur implementation (which looks incredible even today), and one of the first, if not the first, to implement PoM (people ogling over the rocks and pebbles looking like rocks and pebbles, not a flat texture of some rocks and pebbles).
http://gamingbolt.com/wp-content/uploads/2011/09/2gvkaj8.jpg
this blew people's minds back in 2007
Everything about Crysis 1's graphics was years ahead of anything else at the time.
But just because you heard the 'but can it run Crysis' meme a few too many times, you call it unoptimised, despite people with old PCs running it just fine on lower settings and the game scaling extremely well when you lowered settings.

You don't even know what unoptimised means.

I'm not even going to dignify the terrible 'poorly coded/designed' options crap
 
Crysis, Metro 2033, The Witcher 2, and any game with either a 4-8x MSAA or a supersampling option included in the menu are examples of this.
Crysis 1 scaled well with its graphics settings, nothing unoptimised about that game.



No, they are just options, you contrarian

Crysis 1 was terribly optimized. They are poorly coded/designed options, especially when there are sane and reasonable ways to improve visuals. But again, no developer is going to put any real effort in beyond what the consoles are running.
 

Vitor711

Member
Crysis 1 was terribly optimized. They are poorly coded/designed options, especially when there are sane and reasonable ways to improve visuals. But again, no developer is going to put any real effort in beyond what the consoles are running.

But Crysis 1 never came to consoles originally... The 360 and PS3 port was clearly an afterthought.

And I agree that developers should more accurately describe the impact that game settings will have. Perhaps even a rough percentage performance impact estimate too.

Most of my tweaking involves turning each setting off and on individually until I work out where the performance hit is coming from. For games that require a full restart for each setting to take effect... Ugh.
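Purely as a sketch of what that per-setting tweaking boils down to, expressed in Python (the setting names and the apply_settings/measure_avg_fps hooks are made up; they stand in for whatever the game, its config file, or a benchmark tool actually exposes):

# Toggle one setting at a time against a fixed baseline and report a rough
# percentage cost, which is roughly the kind of estimate being asked for above.
BASELINE = {"shadows": "ultra", "ambient_occlusion": "on", "msaa": 4, "post_fx": "high"}

def setting_cost_percent(name, cheap_value, apply_settings, measure_avg_fps):
    apply_settings(BASELINE)
    base_fps = measure_avg_fps()          # benchmark run with everything at baseline

    lowered = dict(BASELINE)
    lowered[name] = cheap_value           # lower just this one setting
    apply_settings(lowered)
    low_fps = measure_avg_fps()           # same run again

    # Rough share of performance spent on keeping this setting at its baseline value
    return (low_fps - base_fps) / low_fps * 100.0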
 

Kezen

Banned
but again, no developer is going to put any real effort in beyond what the consoles are running.
100% wrong. We already have examples of games having better settings than what's available in the console SKUs. More are to come, of course.

options that severely cripple performance without substantially improving the visual experience are great examples of poor coding and developer decisions.
I disagree. Just because you don't happen to notice the higher quality settings does not mean they have not been properly implemented and optimised.
Just an example: fog shadows in Crysis 3. Demanding for sure, but the impact on visuals justifies it.
 

Seanspeed

Banned
Well, most of the PC benchmarks are kind of pointless.
They're not useless at all. You just have to be aware of what, specifically, they are useful for. There are different forms of benchmarks and various testing methods you can use in order to figure out certain things.

The single hardest thing to 'benchmark' is the somewhat subjective but also somewhat objective notion of 'How well does a game run?'. Determining this is painful because the points of comparison are usually other games or any console version of the game. Comparing to other games is flawed because games all run differently, do different things, and how 'good' they look can be quite subjective. Comparing to console versions can be difficult because PC versions often have more advanced graphical effects and different AA methods and whatnot.
 

Durante

Member
People bitch when the 15 fps game only looks on par with or worse than the 60 fps game, and in that case the bitching is perfectly justified. It's pretty much an industry standard at this point that any settings above and beyond the console settings are nothing but half-assed crap, worked on only enough so the game still functions. Performance optimization is completely ignored. Typically, unless it can simply be enabled by typing a different word in an ini file, or Nvidia/AMD come in and do the coding themselves, there will be no extra PC features.
You are completely wrong about pretty much everything.

Of course, a cursory glance at your post history reveals that your goal with this drivel is not to be correct, so it doesn't seem worth it to engage your individual points.
 
100% wrong. We already have examples of games having better settings than what's available in the console SKUs. More are to come, of course.


I disagree. Just because you don't happen to notice the higher quality settings does not mean they have not been properly implemented and optimised.
Just an example: fog shadows in Crysis 3. Demanding for sure, but the impact on visuals justifies it.

yeah better settings that tank performance and are barely distinguishable
 

Gbraga

Member
For comparing across games, the default settings. For comparing to the console performance, console-equivalent settings.

Of course you can also benchmark the "max" settings, but every time you talk about them you should be cognizant of what they actually entail.

The truly bad part is really "game A runs at 60 FPS maxed, game B at 15, ergo game A is much better optimized than game B". And yes, that is what far too many arguments about performance and optimization on gaming forums boil down to.

I certainly remember people being mad because they couldn't run Crysis 3 on Very High with their PCs, even though "just High" was still more demanding than pretty much every other game they played on "ultra".

Hence folks saying the last DMC is better "optimized" than Crysis 3....
Cringeworthy.

Ha, exactly what I had in mind
 

Baleoce

Member
The truly bad part is really "game A runs at 60 FPS maxed, game B at 15, ergo game A is much better optimized than game B". And yes, that is what far too many arguments about performance and optimization on gaming forums boil down to.

Yeah, I think this is the focus of your statement. I think the reality also is that a lot of reviewers simply don't have the technical know-how to consistently evaluate what makes one PC port better than another, and for what reasons beyond how their particular machine reacts to those settings. I like the idea of looking at a game from the perspective of "default settings", "console-grade settings", and then after that you can look at the more technical high-end optimizations and how they affect the experience, negatively or positively, and why.

I think it's fallen into a bit of a convenience rut right now, where charts and graphs let us look at a glance and know that this GPU can handle this game to that extent, etc.
 

Seanspeed

Banned
yeah better settings that tank performance and are barely distinguishable
If you've got the performance to spare, then it's nice to have a little extra detail or a few flourishes.

I agree many of the Ultra, highly demanding settings often don't add too much individually, but everybody is sensitive to things to a different degree. I know Durante and a few others can pick out differences in AO techniques very easily. Personally, I probably could never tell the difference between SSAO and HBAO+ unless they were shown to me side by side. But that may change in the future. I'm far more aliasing-sensitive than I used to be and I'm often upset when I can't downsample my game from 1440p to alleviate the aliasing that's bothering me. Others might not care quite as much. And of course, things can add up. Turn down one setting from Ultra and it might not be a huge deal. Turn down all the settings from Ultra and you might suddenly have a noticeably degraded image.

What's great about it is it's all optional. And it's usually not too hard to get decent performance out of a game even with a moderate rig by turning down settings, based on which are least impactful/important to you. That freedom is quite wonderful.
 

Reese-015

Member
Tbh this is a matter of semantics. I vaguely remember Tyrian did it in a cool way back in the day: you had a high quality setting, above that you even had a Pentium setting (Pentiums were quite new), but above that, and this is what I wanna focus on, you had some sort of 'Future' setting.

If you know that you're making a very high-end max graphics profile, make it very clear that this profile only targets a very small percentage of your audience, unlike the max settings with most other games. If most other games run fine at maxed out settings on ordinary recent 'gaming rigs' and yours does not, you can simply make sure to communicate in some shape or form that your 'max' settings are different from what people normally expect from em. I'm sure you can manage to do that in such a way that people won't benchmark against the max settings if you don't want em to.
 

MultiCore

Member
I've always been a max settings kind of guy, with the exception of antialiasing. I've always taken the time to learn what each setting does, and I've evaluated the performance impact for myself.

I've always loved games that offered options beyond the reach of contemporary hardware, and going back to check them out after an upgrade can be immensely satisfying. (Doom 3, Crysis, heck, even the original Doom. When I upgraded from CGA to EGA to VGA, any game that had the option of taking advantage of it got played at least a little bit.)

That said, I understand the point you're making. The masses aren't interested in really understanding any of it.
 
Fully agreed. The obsession with max settings may be understandable for benchmarks and reviews or for those gamers that want the best image quality possible but for the rest of us, the untrained eye as Maldo described it, the increase in quality when going from High to Ultra is frequently not worth the performance penalty. Mostly stable 60fps takes precedence over any available quality settings. That said, I want those settings there for those who can appreciate them.
 
The thing I'm most impressed by is great scalability. When a game can be a technical show piece on top end hardware but also run respectably on modest hardware, that's when I feel the developers took the time to take advantage of the strengths of the platform.
 
Fully agreed. The obsession with max settings may be understandable for benchmarks and reviews or for those gamers that want the best image quality possible but for the rest of us, the untrained eye as Maldo described it, the increase in quality when going from High to Ultra is frequently not worth the performance penalty. Mostly stable 60fps takes precedence over any available quality settings. That said, I want those settings there for those who can appreciate them.

And for when I buy a new gpu.
Don't need to be sold any shitty 'remastered editions' of 2-4 year old games on pc thanks to pc gaming having options that scale up on powerful gpus.
Remember when almost all pc games used to have a bunch of super demanding options that would scale to future hardware? Those were good times
 

Valnen

Member
And for when I buy a new gpu.
Don't need to be sold any shitty 'remastered editions' of 2-4 year old games on pc thanks to pc gaming having options that scale up on powerful gpus.
Remember when almost all pc games used to have a bunch of super demanding options that would scale to future hardware? Those were good times

The problem with this is, most PC gamers these days just want to crank the slider up without tinkering. Me included, honestly.

Part of the appeal of PC gaming these days for me is you get better graphics and don't have to mess with anything on most games.

...But at the same time I totally see the appeal to developers of wanting to provide the best possible options, even if most (or any) hardware isn't up to the task yet. Ultimately I think tinkering could be worth it if it means no compromises from the developer.
 
And for when I buy a new gpu.
Don't need to be sold any shitty 'remastered editions' of 2-4 year old games on pc thanks to pc gaming having options that scale up on powerful gpus.
Remember when almost all pc games used to have a bunch of super demanding options that would scale to future hardware? Those were good times

Indeed they were!
 

Vitor711

Member
Fully agreed. The obsession with max settings may be understandable for benchmarks and reviews or for those gamers that want the best image quality possible but for the rest of us, the untrained eye as Maldo described it, the increase in quality when going from High to Ultra is frequently not worth the performance penalty. Mostly stable 60fps takes precedence over any available quality settings. That said, I want those settings there for those who can appreciate them.

Same. I can get a near steady 60 fps with crysis 3 at high running at 1080. If I go to ultra, it's worth me locking it to 30 but the visual increase doesn't justify the performance hit.

That being said, the game still looks phenomenal at high, so it doesn't bother me.
 

Durante

Member
Tbh this is a matter of semantics. I vaguely remember Tyrian did it in a cool way back in the day: you had a high quality setting, above that you even had a Pentium setting (Pentiums were quite new), but above that, and this is what I wanna focus on, you had some sort of 'Future' setting.
I think semantics are a part of it, yes. As can be seen in the case of Witcher 2. Another thing developers absolutely need to do is document each setting and its expected performance impact (though this can be more challenging than you might expect).

However, I believe it's also important to
  • Raise awareness among gamers and reviewers alike about the issues with comparing "maximum settings".
  • Clarify once and for all that additional settings never degrade the technical quality of a game (unless, of course, they are completely broken). You don't need to use them.

The thing I'm most impressed by is great scalability. When a game can be a technical show piece on top end hardware but also run respectably on modest hardware, that's when I feel the developers took the time to take advantage of the strengths of the platform.
Agreed. That's another reason why Crysis 3 is so impressive.
 

Kezen

Banned
Remember when almost all pc games used to have a bunch of super demanding options that would scale to future hardware? Those were good times
I do remember that time. Mafia (the first one) was colossally demanding, and it was a great pleasure to play it two years later at max settings.
 

Freeman

Banned
Some settings should have a warning before you turn them on; some games give you options that can kill performance for very little return. I don't think they should be hidden, they should just make it clear what each one does and the impact it will have on performance.

There should also be an option to automatically tune the game to reach a certain target FPS based on a set of priorities and a benchmark. Right now games rarely offer reasonable settings if you let them choose automatically.
 
Some settings should have a warning before you turn them on; some games give you options that can kill performance for very little return. I don't think they should be hidden, they should just make it clear what each one does and the impact it will have on performance.

There should also be an option to automatically tune the game to reach a certain target FPS based on a set of priorities and a benchmark. Right now games rarely offer reasonable settings if you let them choose automatically.
Well, that's what Nvidia's GeForce Experience and AMD's Raptr allow you to do. The problem is that neither gives you a projected FPS target as you adjust the settings. They also don't know when to lower some intensive settings, like Far Cry 3's Post FX.
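As a rough sketch of what such an auto-tuner could look like, assuming a built-in benchmark and a player-chosen priority order (the setting names, levels, and the apply_settings/run_benchmark hooks are all hypothetical):

TARGET_FPS = 60

# Most expendable settings first; the order encodes the player's priorities.
PRIORITY = ["msaa", "post_fx", "shadows", "ambient_occlusion"]
LEVELS = {
    "msaa":              [8, 4, 2, 0],
    "post_fx":           ["ultra", "high", "medium", "low"],
    "shadows":           ["ultra", "high", "medium", "low"],
    "ambient_occlusion": ["hbao+", "ssao", "off"],
}

def auto_tune(settings, apply_settings, run_benchmark):
    # Lower one setting a notch at a time, cheapest-to-lose first,
    # re-running the benchmark until it reports the target frame rate.
    while run_benchmark() < TARGET_FPS:
        for name in PRIORITY:
            choices = LEVELS[name]
            idx = choices.index(settings[name])
            if idx + 1 < len(choices):    # this setting can still go lower
                settings[name] = choices[idx + 1]
                apply_settings(settings)
                break
        else:
            break                         # everything is already at minimum
    return settings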
 

BibiMaghoo

Member
I would imagine the number of people that max out each game is incredibly small, even if high among this forum. New games require a beefy PC by normal standards, and average by our own.

From this alone it's kind of redundant, because the vast majority of people will never run or see a PC game at its maximum values. As enthusiasts, we look to the highest end of what can be achieved, and if these performance judgements are for us alone, then they have some relevance in regards to the returns on our expensive hardware. For the general public? They are utterly useless, and even misleading as to what a customer is going to get when they buy a game.
 

Stallion Free

Cock Encumbered
I love over-the-top ultra settings because they are the icing on the cake that is a 4K or 8K PC "remaster" replay 2-3 years down the road that I'm not being charged $30-50 for.
 

blado

Member
Every time I see a performance analysis of Metro: Last Light which says modern cards struggle at 1080p, when they set SSAA to max, I cry a little.
 

DeaviL

Banned
The biggest problem I have is that my options always range from low to very high. I don't know if medium in game A corresponds to very high in game B or the other way around.
I always feel bad about having to turn down my settings because, while I am easily satisfied with the graphics of a game, lowering them makes me feel like I'm playing something sub-optimal.
If I knew what settings corresponded to settings in other games, and if I knew which settings I do or don't benefit from at my resolution of 1080p (for example, a very high shadow resolution), then I'd be far less "obsessed" with max settings.
 
And for when I buy a new gpu.
Don't need to be sold any shitty 'remastered editions' of 2-4 year old games on pc thanks to pc gaming having options that scale up on powerful gpus.
Remember when almost all pc games used to have a bunch of super demanding options that would scale to future hardware? Those were good times

Indeed they were!

I take it that they no longer are?

<<<< Not a PC gamer
 

BibiMaghoo

Member
The biggest problem I have is that my options always range from low to very high. I don't know if medium in game A corresponds to very high in game B or the other way around.
I always feel bad about having to turn down my settings because, while I am easily satisfied with the graphics of a game, lowering them makes me feel like I'm playing something sub-optimal.
If I knew what settings corresponded to settings in other games, and if I knew which settings I do or don't benefit from at my resolution of 1080p (for example, a very high shadow resolution), then I'd be far less "obsessed" with max settings.

I found the cure for this ailment. When running a new game, turn everything to lowest. Then gradually increase settings until you take a hit, then roll that one back. This way, you don't go from "OMG that looks amazing but runs like shit" to "This runs better but looks like shit compared to that".

You just get "this looks much better than it did before I changed that"

If you get me :)
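Written out as a rough sketch (the level lists and the apply_settings/measure_fps hooks are placeholders, not from any real game):

# Start everything at the lowest level, then raise each setting one notch at a
# time; the moment the frame rate takes a hit, roll that setting back.
def tune_upwards(levels, apply_settings, measure_fps, floor_fps=60):
    # levels maps each setting to its choices from lowest to highest,
    # e.g. {"shadows": ["low", "medium", "high", "ultra"]}.
    settings = {name: choices[0] for name, choices in levels.items()}
    apply_settings(settings)
    for name, choices in levels.items():
        for value in choices[1:]:
            previous = settings[name]
            settings[name] = value
            apply_settings(settings)
            if measure_fps() < floor_fps:   # took a hit: roll this one back
                settings[name] = previous
                apply_settings(settings)
                break
    return settings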
 
What if game devs were more communicative about what settings do and warned players as such? I'm seeing more of this happen these days, but in the OP's case it would be neat if game settings had Video Settings - Advanced Settings - Experimental Settings, where each tier is more likely to mess up your game's performance. I think people would be less likely to max out everything in a category called "experimental" and then proceed to whine about performance than they would be for "advanced settings". It's pretty silly but I think it would help things.
 

Dario ff

Banned
Another good suggestion for graphics settings (hidden in a configurable .ini or not) is that anything that is scalable should have a multiplier value, and anything that is a toggle should be a toggle. Hiding graphical toggles behind sliders is just confusing and unintuitive, or even worse, when in the .ini you get settings such as post_processing level = 2. That really means absolutely nothing to the user. What if there's a post-processing effect that I don't like hidden in there? It happens quite often with stuff like bad Motion Blur implementations.

Presets are fine and should indeed be encouraged, since they hint at a certain level of detail/performance tradeoff that the developers were fine with and should work on most hardware at the time of release.

Like other people said, Serious Sam 3 is a great example of this. You don't need to understand half the settings and the presets are pretty good, but you have a lot of control over pushing the game as much as you want.
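To illustrate the difference between an opaque level and explicit toggles/multipliers (the keys here are invented for the example, not taken from any particular game):

# An opaque preset value tells the user nothing about which effects are on...
opaque = {
    "post_processing_level": 2,       # what does "2" actually include?
}

# ...while spelling the options out keeps toggles as toggles and scalable
# values as multipliers, so an unwanted effect (e.g. motion blur) is easy to find.
explicit = {
    "motion_blur":       False,
    "bloom":             True,
    "depth_of_field":    True,
    "render_scale":      1.5,
    "shadow_resolution": 2048,
}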
 

Jedi2016

Member
My problem is that, while I will happily lower options to improve performance, it's nagging me that the game could look a little better. I do understand what most options in games do, though, which ones I absolutely must have (16x AF), and which ones I might be able to do without (MSAA in favor of SMAA, etc).

I'll sometimes experiment to see what performance effect a particular setting has versus what visual impact it has. For example, some games' "medium" shadows look almost the same as the "ultra" shadows... they're not as accurate, but the overall look is very close, and the difference goes more or less unnoticed, while the extra 10fps I get is definitely noticeable. But that doesn't always happen; depending on the game, if the shadows start getting blocky, I'll crank them up regardless of how much performance it takes.

I also do stereoscopic 3D gaming, so I often have to make different concessions for that, due to the 50% performance hit just from playing in 3D. I'll often have to back off on some settings to get it back where I want it. There's a smattering of games I can play at 60fps at max settings in 3D, which is most excellent.

The other problem that comes into it isn't so much how it performs overall, but how it performs on my machine. Unless you've got the exact same specs I do, there's only one way to know how it's going to really perform for me, and that's to play it myself. This is why I really appreciate games that release demos (which is becoming rarer and rarer these days) or even just benchmarks.

Another problem I find with some settings is when developers (I may wind up calling them lazy just out of spite) lump a bunch of different settings into a single on/off switch. "Post FX" is the worst at this, because it often includes things like AO, bloom, volumetric lighting (sun shafts and the like), sometimes even AA, on a slider that just says "Off/Low/High", without any explanation of what it's doing. I'd much rather have an options menu I have to scroll through for a couple of pages than just a small handful of settings that don't mean anything. I know what all those things do; it's okay to give me the option to turn each of them on and off myself.
 

BPoole

Member
I always find it dumb when I see benchmarks of games running at 1440p+ with 8x MSAA getting only ~35fps, and people crying about poor optimization.
 

Grayman

Member
Crysis certainly did get hit with this stigma. I think the game scales up and down really well but there was this reputation of being unoptimized because the sky was the limit on settings.

On the other hand, at least I did not perceive The Witcher 2 getting rolled on as hard, because the killer option was labeled as ubersampling and clearly documented as not being built for the PCs of today.

Hardware benches are always going to stress parts as hard as they can. It's unfortunate that this puts out low numbers for parts that a lot of people own. In actuality, the end user is best served by a game-specific performance review that says at what settings their midrange or lower gear can run the game at certain framerates.

So it seems like, to keep offering the high-end settings for the most extreme users, future games need to hide the options, or call them bullshit mode or something, so that modest specs aren't even tested on them by benchmark sites.
 

Seanspeed

Banned
Another good suggestion for graphics settings (hidden on a configurable .ini or not) is that anything that is scalable should have a multiplier value, and anything that is a toggle should be a toggle.
Kinda off-topic but I would love a settings screen where you actually had to flip little toggles and pull sliders.
 

Teeth

Member
Every time I see a performance analysis of Metro: Last Light which says modern cards struggle at 1080p, when they set SSAA to max, I cry a little.

To be honest, if you're adding an SSAA option without any explanation, you are wildly overestimating the knowledge base of your audience.
 
I agree that most people throw around the word "optimized" without really understanding anything about the underlying techniques used, and base it on some ambiguous gut feeling instead.

Personally I think PC games should try to get rid of mandatory settings-screen twiddling. Most people have no clue what things like SSAO, SMAA, SSAA or any other term on the settings page mean, or how they affect graphical quality and performance. And no developer should expect it.

The masses end up using really crappy presets or, even worse, GeForce Experience settings that ultimately aren't anywhere close to being optimal. The games should have good enough detection methods and perhaps a small initial benchmark that auto-detects optimal settings. Of course this has existed in some form for a long time, but as I said, the automatic detection is usually shoddy at best.

The other thing I would like to see more of in game graphics is dynamic scaling during rendering. Things like dynamic resolution rendering, but also for effects and such, so that the engine could adjust the graphics on the fly and reduce the load, for example in fast-paced combat where certain effects might not be missed, and then increase the detail in close-up scenes with little movement. I'm not quite sure what it is that makes this kind of thing so difficult that no developer seems to even attempt it. Ultimately the game should be deciding the optimal graphical presentation, and no user should ever see a page full of arbitrary settings and sliders.
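A minimal sketch of the dynamic-resolution part of that idea, nudging a render-scale multiplier toward a frame-time budget each frame (the engine hook named here is an assumption, not a real API):

TARGET_MS = 16.7                      # frame-time budget for ~60 fps
MIN_SCALE, MAX_SCALE = 0.5, 1.0       # never drop below half resolution

def update_render_scale(scale, last_frame_ms):
    # Positive error: the frame finished early, so we can afford more resolution.
    # Negative error: we blew the budget, so scale down a little.
    error = (TARGET_MS - last_frame_ms) / TARGET_MS
    scale += 0.05 * error
    return max(MIN_SCALE, min(MAX_SCALE, scale))

# Called once per frame by the renderer:
#   scale = update_render_scale(scale, frame_ms)
#   engine.set_render_scale(scale)    # hypothetical engine hook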
 