
Judging game performance at "max settings" is enormously counterproductive

Bsigg12

Member
System requirements need to list a rated resolution, frame rate, and corresponding graphics setting.

That would improve optimization for each tier. Instead of "lol, you figure out how to hack it and have it run at ultra."

That wouldn't work, though, since they have no way of accounting for what other processes a user is running in the background. It would just turn into people bitching that they have that exact hardware spec and are getting worse than advertised performance.
 

RiverBed

Banned
It's simple, really:

- The degree of interest (or love, dedication, etc.) doesn't make one better, be it in gaming, religion, political views, activism, etc.

- How common what one does is doesn't make one better; "everyone is doing it", "no one is doing it", etc.

- Time spent on anything doesn't make one better ("I spent X hours watching this", "I spent Y hours smelling that" shows nothing beyond, literally, how much time you spent on something; you could be good at it or an idiot).

There are more points, but why continue with the obvious. The difference between self-reflection and group-reflection speaks for itself.
 
As I posted in the other thread:



I think this mentality is very harmful to PC games and I would love to see games push their tech as far as possible.

I don't think many multiplatform devs are going to put much effort into PC versions when tons of people just buy them for a handful of dollars in Steam sales, especially if that extra work will only be seen in 2-3 years with better graphics cards.
 
I take it that they no longer are?

<<<< Not a PC gamer
You still have them in some games (Metro 2033 had them, for instance), and more and more games are offering an in-game setting for supersampling (image quality).
But in the old days a lot of games had some really ambitious settings or options that current hardware couldn't run properly, or that only the very latest GPUs or processors supported.
For the majority of games it's just stuff like ambient occlusion, higher-resolution shadows, higher-precision post effects, full dynamic range lighting, etc.
They can be quite demanding but are nothing exotic, just superior-quality versions of the lower-end settings.

Maybe when DX12 releases and there are some new features, we'll see some of that more ambitious stuff again.
When DX11 GPUs were new you had some too, but they were almost always tacked on and half-assed, an afterthought.
There were some exceptions, like STALKER: Call of Pripyat, which had some advanced soft shadows, cool volumetric smoke, and a tessellation implementation that worked pretty well.

A true modern equivalent of the old-school 'future settings' would be a UE4 game where you had the option to swap the Lightmass baked lighting (non-dynamic, precalculated lighting and shadows) for the global illumination Epic showed in their tech demos two years ago (which looks like a full generational leap over lighting in previous games if you see it in motion), before they decided consoles and the average League of Legends PC weren't powerful enough for it and scrapped it.
In better times Epic would have kept this feature intact and supported it in one of their flagship releases on PC, just because they could and just because it is awesome.


Kinda off-topic but I would love a settings screen where you actually had to flip little toggles and pull sliders.
Proto-CoD (Medal of Honor: Allied Assault) kind of had you covered:
[image: mohaaconsole.jpg]
 
Performance at "max" settings, without context and deep understanding what these settings entail, is completely irrelevant for judging the technical quality of a game, and it's highly damaging how often it seems to be used to evaluate the same. I've wanted to make a thread about this for a while, and seeing how there is right now another one on the front page with "max settings" in the title it seems as good a time as ever.

These days, many people seem to judge the "optimization" (a broadly misunderstood term if I ever saw one!) of games on how they run at "max" settings. What does this mean in practice? Let's say I'm porting a game to PC, and I'm trying to decide which options to include. I could easily add the option of rendering shadow depth buffers at 32-bit precision and up to 4096x4096 instead of the 16-bit and 1024² default. But what would this actually do? Basically, it will improve IQ and image stability, especially at very high resolution. However, let's assume for the sake of argument that it also halves the framerate of my port when enabled.
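To put rough numbers on that, here is a hypothetical back-of-the-envelope sketch (not taken from any actual engine; real engines cascade several maps and pay extra bandwidth and fill costs on top): the jump described above multiplies the per-shadow-map memory footprint alone by about 32x.

```python
# Hypothetical sketch: memory footprint of the two shadow-map configurations
# mentioned above. Illustration only; real costs also include bandwidth and fill rate.
def shadow_map_bytes(resolution: int, bits_per_texel: int) -> int:
    return resolution * resolution * bits_per_texel // 8

default_cfg = shadow_map_bytes(1024, 16)  # 16-bit depth, 1024x1024 (the port's default)
maxed_cfg = shadow_map_bytes(4096, 32)    # 32-bit depth, 4096x4096 (the optional "max")

print(f"default: {default_cfg / 2**20:.0f} MiB per shadow map")
print(f"maxed:   {maxed_cfg / 2**20:.0f} MiB per shadow map "
      f"(~{maxed_cfg // default_cfg}x the memory, before any bandwidth/fill cost)")
```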

In the prevailing simplistic mindset, I just went from a "great, optimized port" to a "piece of shit port showing how my company is disrespectful of PC gamers" merely by adding an option to my game.

I hope everyone can see how fucking insane this is. As a developer aware of this, I basically have two options:
  1. Only allow access to higher-end settings via some .ini file or another method which is not easily accessible.
  2. Simply don't bother with higher-end settings at all.
The first option wouldn't be too bad, but it seems to be the much rarer choice. If the prevailing opinion of my game's technical quality actually goes down by including high-end options, then why bother at all?

Of course, gamers are not to blame for this exclusively. Review sites got into the habit of benchmarking only "max" settings, especially during the latter part of the PS360 generation, simply because GPUs wouldn't be challenged at all in the vast majority of games otherwise.


They could end all this by just adding a "console" preset alongside low, medium, high, and custom.
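A minimal sketch of what that could look like (every name and value below is made up for illustration, not taken from any real game): presets are just named bundles of settings, so a "console" entry would simply pin the values the console builds actually ship with.

```python
# Hypothetical sketch: graphics presets as named bundles of settings, with a
# "console" preset that mirrors what the console versions actually ship with.
PRESETS = {
    "low":     {"shadow_map_res": 512,  "ssao": False, "post_fx": "low",    "texture_quality": "low"},
    "console": {"shadow_map_res": 1024, "ssao": True,  "post_fx": "medium", "texture_quality": "medium"},
    "high":    {"shadow_map_res": 2048, "ssao": True,  "post_fx": "high",   "texture_quality": "high"},
    "ultra":   {"shadow_map_res": 4096, "ssao": True,  "post_fx": "ultra",  "texture_quality": "ultra"},
}

def apply_preset(name: str) -> dict:
    """Return the settings bundle for a preset; 'custom' would bypass this entirely."""
    return dict(PRESETS[name])
```

Benchmarking against the "console" row would give an apples-to-apples baseline, with everything above it explicitly a bonus.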
 

Arulan

Member
I don't think many multiplatform devs are going to put much effort into PC versions when tons of people just buy them for a handful of dollars in Steam sales, especially if that extra work will only be seen in 2-3 years with better graphics cards.

Please don't spread misinformation. It has been demonstrated over and over again that not only do Steam users buy games at launch in large numbers, but that the effect of Steam sales is almost always a positive for their revenue.

Could someone post the graphs/information I'm referring to? I don't have them bookmarked.
 

eot

Banned
I think that if you're going to include graphical options that absolutely kill performance, it's a good idea to be more explicit about it than just hiding it behind an "Ultra" preset or something like that. I'd rather have it be in the .ini, because then I at least know what I'm getting. That would help people who do benchmarks too; it might not be obvious what a catch-all like "postprocessing" does at different settings.
 
If there is a frame-killing option, I think developers should at least put a disclaimer or something next to it.

Took me a while to realize that the "POST FX" option in Far Cry 3 kills my framerate if I don't set it to low.

FC3 is actually fundamentally broken, though. I cannot get the graphics settings any lower and I still see huge frame dips and stutter. Sometimes I even get treated to an annoying bug that randomly dips to 30fps and stays there until I change a graphics setting. This is on a 3GB 580, and the performance is close to identical on lowest and max. I gave up trying to play it.

Post FX is heavy on the card but it's not the only problem plaguing the game. I've yet to find anyone who got it to stay at 60 either.
 
Are you talking about professional game reviewers or amateurs on message boards? Because while I admittedly have little experience with PC reviews (PC Gamer, Rock Paper Shotgun), I have a hard time believing a PC game review is going to dock points based on the criteria you mentioned. It would be insane, but I don't really know enough to say if this is happening. Actually, let me do some quick research.

Yup, just like I thought: in every review I read where performance was mentioned, nothing like you describe was occurring. Reviewers only mentioned framerate issues after testing on multiple hardware configurations at multiple settings configurations, and let's be honest, if your game runs like shit on the newest, most powerful hardware at mid to high settings, you should be called out. I read a few reviews that mentioned framerate issues but gave the game a pass, saying they got a decent 30fps at mid settings on older hardware. One review mentioned Watch Dogs had low framerates at low settings on good hardware, and that is something that should be inexcusable.

Nowhere did I read a review taking a visually complex game to task for running poorly on 'max' settings. They just dial it down and seem satisfied to get a solid 30fps. Often they praised a game for being highly scalable. I'm guessing you've worked on some games with some perceived performance issues? I'm also guessing you aren't talking about professional PC game reviews, because even IGN and GameSpot are in no way guilty of the things you describe in your post. Cheers.
 

Durante

Member
I stopped reading "professional" game reviews years ago, so this is primarily about forum discourse. However, the issue and related problems also occur in some professional writing which I do read, e.g. hardware reviews, comparative analysis, or semi-technical stuff such as the 4K gaming article mentioned previously (which provided 4K results with 4xSSAA for one game, exclusively).
 
Also, to be clear, I know little to nothing about this stuff (though I am the proud owner of a new built-from-scratch gaming PC!). 'Optimization', to my understanding, is how much of your CPU and GPU the game utilizes vs. how well it runs. For example, ARMA 3 barely seems to utilize my GPU or CPU yet it runs like shit, on low or high settings. As a result I call it poorly optimized. Is this correct?
 

Durante

Member
Also, to be clear, I know little to nothing about this stuff (though I am the proud owner of a new built-from-scratch gaming PC!). 'Optimization', to my understanding, is how much of your CPU and GPU the game utilizes vs. how well it runs. For example, ARMA 3 barely seems to utilize my GPU or CPU yet it runs like shit, on low or high settings. As a result I call it poorly optimized. Is this correct?
No. You really can't simply look at these utilization numbers and tell how well optimized something is. I can write something which utilizes 100% of both your CPU and GPU and is still horrendously inefficient. And the inverse is also true, though to a somewhat lesser extent.
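As a toy illustration of the CPU half of that point (a hypothetical sketch, not code from any game): the snippet below pegs every CPU core at 100% while doing nothing a player would ever benefit from, which is exactly why raw utilization numbers say so little about optimization.

```python
# Hypothetical toy example: pins every CPU core at ~100% utilization while doing
# no useful work at all. High utilization alone doesn't mean well-optimized.
import multiprocessing

def spin() -> None:
    x = 0
    while True:
        x += 1  # burn cycles; nothing here ever reaches the screen

if __name__ == "__main__":
    for _ in range(multiprocessing.cpu_count()):
        multiprocessing.Process(target=spin, daemon=True).start()
    input("Task manager now shows 'full' CPU usage. Press Enter to quit.\n")
```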
 

Qassim

Member
Reading PC threads on here I've wondered the same thing. Isn't optimisation how well your hardware runs on lower-end hardware? So who cares what FPS you can get at max settings?

Assuming you mean how well your software runs on lower end hardware, then no, not necessarily.

A game could be extremely demanding for even the best currently available GPUs, but could also be a very well optimised game. Optimisation (in this context) is about making the best use of the resources available, not how much of the available resources you're using or the required amount of resources.

People all too readily declare a game to be "not optimised" just because it is a demanding game. Crysis 3 is a demanding game but is very well optimised (for example).
 
FC3 is actually fundamentally broken, though. I cannot get the graphics settings any lower and I still see huge frame dips and stutter. Sometimes I even get treated to an annoying bug that randomly dips to 30fps and stays there until I change a graphics setting. This is on a 3GB 580, and the performance is close to identical on lowest and max. I gave up trying to play it.

Post FX is heavy on the card but it's not the only problem plaguing the game. I've yet to find anyone who got it to stay at 60 either.

It stays well above 60 for me (I have a GTX Titan) as long as I set Post FX to low. At least in DX9 mode; haven't really tried DX11.
 

mclem

Member
You shouldn't lose a lot of performance with streaming? Or do you mean 30/60 FPS etc?

Most things I've found seem to suggest that you really need to run at 720p to ensure that the streaming encoding/decoding can take place in real time. I just about got away with playing BG:EE at full resolution, but some of the cutscenes would be a little on the choppy side. Conversely, Borderlands (a more visually demanding game) was pretty much seamless at 720p.
 

Teremap

Banned
This thread is SO deserving of more views.

Thank you for making this thread, Durante. Honestly, this mindset is just so incredibly irritating and damaging to any notion of progress in graphics in games, it needs to stop. I LOVED the fact that Crysis had forward-thinking graphics settings, giving me plenty to look forward to when I would eventually upgrade in the future (and boy, was it worth it!). Imagine my disappointment when it turned out that Crysis 2 was not only compromised on the gameplay side, but even with the DX10/11 patch the game was still not even approaching the level of future-proofing that the original contained. (They somewhat redeemed themselves with Crysis 3, but only just so.)

It's also irritating how many posters seem to think that 'optimization' is some magical cure-all that makes things run faster without an ounce of compromise... and of course, they're wrong. Every time someone throws that word around without qualifying what they mean, I already know their opinion isn't one that's worth reading. What a shame.
 

SapientWolf

Trucker Sexologist
Crysis, Metro 2033, The Witcher 2, and any game with either a 4-8x MSAA or a supersampling option included in the menu is an example of this.
Crysis 1 scaled well with its graphics settings; there was nothing unoptimised about that game.



No, they are just options, you contrarian

Edit: this guy has to be trolling. No real effort beyond what the consoles are running?
For starters, Crysis 1 was made for PC, with no intention to port it to consoles until years later, so your argument falls apart right there (as does any pretense of you knowing what you're talking about).

Crysis was the first game to include proper godrays (sounds boring now, but people ogled them at the time), the first game with such a high-quality per-object motion blur implementation (which looks incredible even today), and one of the first, if not the first, to implement PoM (people ogling rocks and pebbles that actually looked like rocks and pebbles and not a flat texture of some rocks and pebbles).
http://gamingbolt.com/wp-content/uploads/2011/09/2gvkaj8.jpg
This blew people's minds back in 2007.
Everything about Crysis 1's graphics was years ahead of anything else at the time.
But just because you heard the 'but can it run Crysis' meme a few too many times, you call it unoptimised, despite people with old PCs running it just fine on lower settings and the game scaling extremely well when you lowered settings.

You don't even know what unoptimised means.

I'm not even going to dignify the terrible 'poorly coded/designed' options crap
Even Crytek said Crysis 1 wasn't optimized very well.
 

Vaporak

Member
Even Crytek said Crysis 1 wasn't optimized very well.

Crytek admitted what everyone wanted them to admit so as to try to get a better relationship with their customers. Crysis 1 was actually fairly well optimized; it ran better while doing more technical work than a lot of games did even years later. But the average gamer has little to no ability to discern what actually takes a lot of computing resources to achieve and what takes only a little.
 
This thread feels appropriate to read after seeing a few AssCreed Unity playthroughs. Thought I'd bump it to see if anyone agrees.
 
This thread feels appropriate to read after seeing a few AssCreed Unity playthroughs. Thought I'd bump it to see if anyone agrees.

And now Tomb Raider. Holy hell is it relevant for that game.

EDIT: Just realized that was a 2014 post. Don't care. This thread deserves a bump :p
 

epmode

Member
I still like this thread.

Personally, I love when companies provide max settings that tax even the best machines. So long as the game performs well at reasonable settings, I mean. Going back to games like that when you finally upgrade your hardware is a lot of fun.
 
I set 60fps as my absolute benchmark; all graphics settings are subject to change because of that. All effects can and will be sacrificed in the pursuit of a high and stable framerate.

I would rather publishers set the same priorities in their literature, but I guess pretty graphics are an easier selling point than framerates.
 
I never understood this, to be honest. There is no standard for "max", "insane", or whatever the hell you wanna call those settings, so how can these arbitrary names serve as any basis for comparison?

It would be different if "Ultra" always meant the same exact settings.
 
While I agree on the whole, I don't tend to be too dismissive of this opinion, because I think it's a rather natural way for a human to think: I have the highest-end GPU, therefore I should be able to run the highest settings. While we all can agree this is counterproductive, it's not exactly a freakishly illogical expectation. At least until you know better, anyway.
 

Alvarez

Banned
You know what's fun? Setting Shadows (or SSAO) from High to Ultra, seeing no visual difference, and losing 30 FPS. Same thing happens with AA sometimes.

I usually keep my stuff on High; with few exceptions, Ultra has too much performance impact for too little return. Many types of AA are just gross-looking, too. I'd rather have jaggies than a blurry mess.
 
And now Tomb Raider. Holy hell is it relevant for that game.

EDIT: Just realized that was a 2014 post. Don't care. This thread deserves a bump :p

Eternal Recurrence |OT| Judging game performance at "max settings" is enormously counterproductive

Seriously people, not everything has to be at ultra!
 

Synth

Member
Eternal Recurrence |OT| Judging game performance at "max settings" is enormously counterproductive

Seriously people, not everything has to be at ultra!

I think the main issue for a lot of PC gamers (especially those that jumped on the PC bandwagon more recently, around the time the 7th-gen consoles were completely outclassed) is that there's not much to inform them about what shouldn't be set to Ultra. This becomes especially problematic when the performance of the game varies drastically in different areas (like RotTR appears to), so the player settles for a configuration that they spent a good amount of time tweaking near the start of the game, only for it to completely shit the bed later on, while they may not actually notice a visual disparity that would make the drop make logical sense to them.
 

MultiCore

Member
I don't judge game performance this way, I judge hardware performance this way.

And boy, what a joke hardware has been, for quite a while now.

Mentioning RotTR, the difference between a 780 Ti and 980 Ti is 60 to 90 FPS. If I can't even get up to 120/144hz by upgrading to the fastest single card on the market, why bother?

Save us, Pascal.


If a game releases that doesn't tax your hardware, you can rest assured they were targeting a broader market or consoles.
 

EctoPrime

Member
Changing Ultra to a year in the future would make it easier for a lot of users to understand as to why their current machine chugs.

Low
Medium
High
2020

Plus nobody can really whine about the setting until the year arrives and by then midrange cards will probably run the game at 200 fps.
 

Synth

Member
Changing Ultra to a year in the future would make it easier for a lot of users to understand as to why their current machine chugs.

Low
Medium
High
2020

Plus nobody can really whine about the setting until the year arrives and by then midrange cards will probably run the game at 200 fps.

Ha, I like this idea actually.

I remember back when Doom 3 came out and Ultra was very much an option you weren't expected to engage at the time. This was easily understood, even by those with very little clue about graphical settings (such as myself). Over time, though, the "Ultra" setting has frequently taken the role of what would just as easily be described as "High", where a mid-to-high-end card of the time would simply chew through it with ease. Maybe the best thing would be to have more of a standardized understanding of what "Low/Medium/High/Ultra" mean when applied to games in general, so that if two games release at the same time to run on the same range of hardware, Ultra isn't a complete no-brainer on one, still pushing out 1440p effortlessly at 90+ fps, while on the other it cripples that same hardware down to sub-30fps through many areas of the game. In that case, maybe the former game shouldn't actually have an Ultra setting at all.
 

knitoe

Member
"MAX" settings means setting every graphic settings to their highest. Makes it easy for people to compare which is fine in that context, similar to Low, Medium, High, Very High, Ultra and etc. Of course, most people will run at different lower settings while playing. For example, in Rise of the Tomb Raider, I get 19fps @ max 4K. Obviously, I would never play at those settings, but if someone with a similar hardware says they get 40fps, something is not right. Either my setup is screwy or the other person isn't running at "max" settings.
 

Teeth

Member
I am of the personal opinion that developers should make sure that a consumer with the fastest single GPU on the market can maintain a solid 60fps at 1080p with the in-game graphics options maxed out (discounting anti-aliasing solutions better than FXAA), and then, if people want to tweak further, let them do so with .ini files.

How would they predict that 2 years out?
 

low-G

Member
Just because a graphical technique has been optimized in code doesn't mean a game is optimized. If the result is garbage and the game looks bad, because the actual graphical techniques are worthless, then the game is not optimized. If a game looks bad and runs bad, it's not optimized, period.

Think of it this way: What if a game actually mined bitcoins in the background at an incredible rate. It wouldn't matter if that was the best bitcoin miner in the universe, it's not an optimized game.

I think developers SHOULD hide settings if they're terribly inefficient on modern hardware; I don't mean worthwhile graphical improvements. I'd venture to say that >90% of settings in current PC games are worthwhile, but there's always one or two which basically no human could perceive outside of a screenshot and which no sane person would ever care about.
 

sear

Banned
I see your point.

Not everyone is going to max out the game, nor see that as the "default" experience for the mainstream userbase that plays a game.

The majority of PCs and people playing PC games aren't running high-end GPUs and setting the details to max.

So using that as a performance barometer would be highly misleading, because the general masses wouldn't be getting the same experience that a smaller audience of enthusiast-level PC owners typically considers the norm.
This isn't quite what the OP was talking about.

What he was arguing is that many developers build PC games with settings that you can crank super high so they are future-proofed, or "just because we can", but that it also costs a lot of performance to do so. In other words, what constitutes "max settings" is highly variable and depends on developer intentions. Is "max settings" the "as it was intended/designed" setting, or is it "you are crazy to try this?"

TotalBiscuit and similar PC gaming enthusiasts will eviscerate a game because it doesn't run at 120fps on a crazy high-end setup, even though it's actually not an "optimization" problem; it's that the "ultra" or "extreme" setting or whatever is literally too demanding for any hardware to run at high framerates. There is a level of entitlement there, like "I paid this much for my computer, ergo it should run this game at X performance level".

Meanwhile, games that look good and run at high framerates, but don't necessarily scale well or crank things to "beyond reasonable" levels, are interpreted as being "well optimized" and therefore are praiseworthy. There is actually a very good business reason for developers to hold things back.

The average consumer of games has no idea what "optimization" entails so just assumes it means "does this game run well or not". In reality, unless you are doing something very wrong, optimization of games is a delicate balancing act of sacrifices you make. If people notice too much, you dun goofed. If people don't notice, it means your game runs better with very little to no noticeable image quality loss.

Even GPU drivers from AMD and NVIDIA that are "optimized" for certain games are in reality decreasing image quality; they're just doing it in ways specific to each game that are not noticeable (e.g. reducing alpha buffer resolution on transparency effects, reducing color depth selectively, reducing texture filter quality, etc.).

That's not to say there aren't poorly optimized games out there, of course. But it's not nearly as simple as "make your code run better, idiot". Usually more than anything it's a simple intersection of time and money available to keep working on that game, or some sort of fundamental hardware or software bottleneck which is difficult if not impossible to overcome. Example: see Arkham Knight, which ran on a version of Unreal Engine 3 heavily customized for console architecture, and which was supremely bottlenecked on PC as a result, to the point where "optimization" would basically mean "completely rebuild the game from scratch" and therefore wasn't economically viable to do.

As mentioned previously by others, devs need to provide an option called "Console Settings", which would actually give us a good idea of how well optimised a PC port is.
As a producer, I can tell you that this is probably a bad idea for PR reasons. Console sales of multi-platform games (at least in the triple-A space) typically make up upwards of 85% of total sales volume. You do not want to invite a PR gaffe where someone says "Look, the console version runs at low settings! And the Xbox version runs at settings even below the lowest! This developer doesn't care about consoles!" Leave it to the press like Digital Foundry to nitpick things if they want; there's no benefit to officially saying what settings the console game runs at.
 

Truant

Member
Good thread.

"Max settings" in game A is not the same as "Max settings" in game B. The very idea of comparing the quality of a port by some arbitrary naming of graphical options is ridiculous.

PC gamers complain when a game looks like their console counterparts, yet when the developers add higher tiers of graphical fidelity, they call it "unoptimized".
 

Durante

Member
They should call "medium" setting "normal", "low" settings "optimized", and high as "for the 1%"
That's not really true.

Everyone can eventually benefit from not arbitrarily limiting maximum settings just so that unreasonable people don't get their panties in a bunch.

Good thread.

"Max settings" in game A is not the same as "Max settings" in game B. The very idea of comparing the quality of a port by some arbitrary naming of graphical options is ridiculous.

PC gamers complain when a game looks like their console counterparts, yet when the developers add higher tiers of graphical fidelity, they call it "unoptimized".
Exactly. But when you mean "idiots", say "idiots" and not "PC gamers"; I self-identify as one of the latter too ;)

This isn't quite what the OP was talking about.

What he was arguing is that many developers build PC games with settings that you can crank super high so they are future-proofed, or "just because we can", but that it also costs a lot of performance to do so. In other words, what constitutes "max settings" is highly variable and depends on developer intentions. Is "max settings" the "as it was intended/designed" setting, or is it "you are crazy to try this?"

[...]

Meanwhile, games that look good and run at high framerates, but don't necessarily scale well or crank things to "beyond reasonable" levels, are interpreted as being "well optimized" and therefore are praiseworthy. There is actually a very good business reason for developers to hold things back.
Exactly. And that's why I say that it is in fact not just idiotic and tiring, it's also actually counter-productive for enthusiasts.
 

sear

Banned
My personal steps for enjoying PC games:

1. Buy the best hardware you can afford on your budget.
2. Change a game's settings until you can run it at what is to you an acceptable trade-off of performance and image quality.
3. Play the game and stop worrying. If it's fun, it won't matter how amazing it looks.
 

stuminus3

Banned
Couldn't agree more, Durante. Unfortunately it's an argument I've been making for at least 20 years now. It's great that the PC platform is popular, but the amount of FUD and misinformation spread by so-called "PC" gamers across the internet is staggering. I find the general public doesn't even understand what Vsync does, yet they're all somehow experts on game performance.
 
I judge all games by Medium or in some cases High.

If it runs on my 860m at a stable framerate with their Medium settings, then I'm set.
I'm not there for graphics. Just for convenience.

(As an example, BLOPS 3 needed to have some settings on High to get the awesome textures, so that game is one of the few where High settings mattered to me.)
 
Vendor-specific features should definitely ALWAYS be disabled in these benchmarks. They tank framerate and offer nothing. I've disabled PhysX in literally every game I've ever played. Even when my 670 was new, it would still tank FPS in games, and it was a very capable card.

I feel like benchmarks should be done at medium spec with all vendor-specific junk off. That would best reflect how most people actually end up playing the game.

Afterwards, feel free to bump it to high/very high to show off how it holds up.
 