
Judging game performance at "max settings" is enormously counterproductive

Mohasus

Member
This is something that I thought about in depth a long time ago. It's not fun to turn down your graphics settings, and it can easily compromise the experience that the artists worked so hard to evoke.

If you ask me, all games should run smoothly at maximum settings on any computer that costs no more than $500. Of course, that couldn't be further from reality, but the idea of being unable to enjoy a game's ideal visuals without building a $1,000 computer is absurd to me. Not only does it mean some people have to play with massively toned-down visuals, it also bars a whole group of people from being able to play your game at all.

I feel like if you can't get around requiring such intense hardware, you should be using a more performance-friendly, stylized approach to visuals. Not everything has to be hyper-realistic; you can make your game look really charming and save on performance at the same time.

In my dreams though, smh.

You don't need to think that a game can only be enjoyed at max settings; otherwise no one would be able to enjoy console games. Also, the difference between medium and ultra nowadays is very, very far from massively toned down visuals; medium still looks damn good. I say this as someone with a GTX 970 and a 1440p display, in other words, someone who can't max most recent games.

What you said would just massively hold back graphical settings and cards; there would be no need for a high-end card or new expensive options. I love HBAO+ because some SSAO implementations were terrible, but it made my FPS drop so much when I had just a GTX 660. People are already complaining that the CPU and GPU markets are stagnant; imagine if there was no reason to improve because that power wouldn't be used.

What you are suggesting would be like "hey guys, rename medium to ultra and call it a day, no point putting effort into the game, that will just bite us in the ass later", which is kinda true even today.
 

belmonkey

Member
It's pretty annoying trying to look up 4K benchmarks for games when literally every video is "4K max settings", and then you're told how unplayable 4K is.
 

Bear

Member
It's nice to see this bumped in light of the discussion surrounding Deus Ex: MD.

Ultimately I think people need to realize that it's a good thing when developers keep future hardware in mind when they design games. Technology is always improving so we should be encouraging developers to explore more advanced settings that aren't locked down to contemporary hardware. As long as the game can run well on common hardware while meeting current visual standards, they shouldn't be criticized for going beyond that and including more sophisticated features for PCs that can support it.

Now, I can't comment specifically on Deus Ex but the knee-jerk reaction to its performance is definitely an unhealthy mentality regardless of how optimized (or not) the game is. Context is essential when evaluating a game's performance and it's irresponsible to jump to conclusions without it.
 
To be fair - while "optimization" is broadly misunderstood (and misused) by users of every platform, the PC market has actively promoted shifting responsibility for performance to the user for the last two decades, and it's inevitable that people who didn't grow up with that phase, or are new to the platform, won't understand that they are in fact intentionally empowered to play with the options.
But on the flip side, if you want an open platform with no rules, you inevitably have to accept that "no rules" applies to user perceptions as well, and if someone expects max settings to work great out of the box, they aren't necessarily any less in the right about it than anyone else. :p

That's quite the jump in logic. "Allowing total user freedom" is not "the user is not wrong if he says the sky is green".
 

KingBroly

Banned
When devs release minimum and recommended specs, they should tell me what those specs give me in terms of resolution and framerate (VERY IMPORTANT), as well as other things.
 

RealMeat

Banned
Couldn't agree with this more. I get that when you've just paid $500+ for a video card you want to turn up all the sliders and not worry about it, but it'd be a shame to discourage developers from future-proofing their games.
 

pronk420

Member
This is a really good point. I remember someone posting on here a while ago asking why Crysis was so poorly optimised because they couldn't run it at 'max' whereas other games could. Some people seem to think 'max' settings are some kind of universal standard, whereas in reality they vary massively from game to game.
 

Fafalada

Fafracer forever
Weltall Zero said:
That's quite the jump in logic. "Allowing total user freedom" is not "the user is not wrong if he says the sky is green".
But they're not.
"Max" settings in of themselves are a made-up term on an open platform with no hardware standards. Arguing that it's logical to expect them to be one way or another is silly - the only expectations are those set by the market at large.
 

Durante

Member
To be fair - while "optimization" is broadly misunderstood (and misused) by users of every platform, the PC market has actively promoted shifting responsibility for performance to the user for the last two decades, and it's inevitable that people who didn't grow up with that phase, or are new to the platform, won't understand that they are in fact intentionally empowered to play with the options.
Right, that's why they should learn, and try to put a minimum of thought into a simple analysis of what their expectations will actually encourage in terms of developer behavior.

That's the point of this thread.

I'm still baffled that it seems not to be, but then again I'm constantly baffled by what people don't understand.
Perhaps an analogy would help them understand? Like, um... having two hair dryers, one that can go up to 1000 watts and another that can go up to 2000 watts, and complaining that the latter is worse because it consumes more at max power? I'm sure people can come up with better analogies.

I've been reading up on rationality lately and I think this is a clear-cut case of the affect heuristic:
That's interesting. Seems quite closely related to how some people on this forum seem to feel better about playing a game on console with a given unchangeable IQ and performance, rather than on a PC if they can't "max" that version -- even though they could run it at higher performance than the console version.

It's behavior which I have an extremely hard time wrapping my head around.
 

Momentary

Banned
"If you can't max out a game at 4K60 then you don't have a 4k machine and/or the game is un-optimized."

What a load of crap. I enjoy a ton of games on my 980m at 4K60.

Zestiria, Disgaea, Neptunia, Furi, DARIUSBURST, and countless other games. I can even max some of them out. Hell, I'll be using my Titan to downsample from 8K.
 

Sotha_Sil

Member
I've often wondered what a prospective build with a 6600K and a 480 could do on most games at just high settings instead of ultra. I imagine everything would be 1080p/60fps.
 

TheSeks

Blinded by the luminous glory that is David Bowie's physical manifestation.
So at what settings should we judge game performance? (or am I missing the point?)

1080p. The setting most people (read: Consoles and low end) are gonna be targeting.

A lot of PC benchmarks go with higher resolutions. Of course low end (1070) cards aren't gonna hit 60FPS on Ultra with those.
 

Jedi2016

Member
I knew a guy years ago that was bitching about how some game or other "sucked" because he couldn't run it at max settings at 1080p/60. When I mentioned that he could turn some settings down, he said, and I quote: "I shouldn't have to do that."

I made an earlier point about trade-offs, and it's something I still stick to. I have a GTX 1080 now, so maybe I shouldn't "have to" turn settings down, but now that I think about it, I do have The Witcher 3 maxed out... except for Hairworks, because the performance hit just isn't worth the improvement that I probably wouldn't notice anyway.

My performance ceiling has also gone up since then... it used to be 1080p/60, now it's 1440p/144. I could turn everything in TW3 down to Medium/Low and actually get the game to hit 144fps, but I won't. Does that mean the game is "unoptimized"? Of course not.
 

joms5

Member
The way I see it, if a developer wants to "future proof" a game, or put in options that allow it to look better in the coming years, that should be labelled as such in the options.

For example, if you choose to put in a setting called MAX, perhaps a prompt should tell you that the settings will not run optimally on any card lower than a _______.

If a game cannot run at a stable framerate at its highest settings (meaning the highest available settings that can properly run on current-gen hardware) on the best current-gen hardware, I think that's a problem.

Or to avoid the labeling confusion maybe just future proof it with a patch at a later date?
 

Durante

Member
The way I see it, if a developer wants to "future proof" a game, or put in options that allow it to look better in the coming years, that should be labelled as such in the options.

For example, if you choose to put in a setting called MAX, perhaps a prompt should tell you that the settings will not run optimally on any card lower than a _______.

If a game cannot run at a stable framerate at its highest settings (meaning the highest available settings that can properly run on current-gen hardware) on the best current-gen hardware, I think that's a problem.

Or to avoid the labeling confusion maybe just future proof it with a patch at a later date?
I think that's a lot of hoops you require developers to jump through just because people are being irrational.
 
I think that's a lot of hoops you require developers to jump through just because people are being irrational.

Agreed. I also thought it worked out quite well for Crysis, keeping that game bought, played, and talked about for years after release. Didn't really matter that it couldn't be "maxed" at release. That's part of what made it awesome.
 

Mike Golf

Member
When devs release minimum and recommended specs, they should tell me what those specs give me in terms of resolution and framerate (VERY IMPORTANT), as well as other things.

Agreed, the provided specs for minimum and recommended performance tell us nothing without accompanying resolution and average framerate expectations. If that info was provided and was accurate, customers could make an informed purchase much more quickly rather than having to scour multiple threads and articles on different sites trying to parse out what their machine's expected performance is going to be.

Of course, providing a demo for every game which includes a benchmarking tool would be the ideal fix, and it would have the benefit of the developers not having to do in-depth benchmarking for the spec lists themselves and/or pay a third party to do it.
 

Parsnip

Member
It is interesting how cyclical this seems, I guess there are always new PC gamers whose mindset may not be tuned to how PC gaming "works".
 

Fafalada

Fafracer forever
Durante said:
I think that's a lot of hoops you require developers to jump through just because people are being irrational.
Any sort of options menu is jumping through hoops to begin with - but it's done because doing it entirely through .ini files is bad UX that only a select segment will put up with.
Which perhaps is the real answer - leave the more extreme settings out of the graphical interface; it will keep the audiences who are likely to complain about them from complaining, while still leaving the option there for others.
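A rough sketch of what that could look like (the file name, section names, and setting names below are all invented, not taken from any real game): the options screen only knows about a whitelist of settings, while anything in a hidden [experimental] section of the .ini is still applied by the engine but never displayed.

```python
# Sketch only: hypothetical settings loader where "extreme" options live in
# an [experimental] .ini section that the graphics menu never exposes.
from configparser import ConfigParser

# Settings the in-game options screen is allowed to show and edit.
GUI_SETTINGS = {"texture_quality", "shadow_quality", "ssao", "anisotropic_filtering"}

def load_graphics_settings(path="graphics.ini"):
    cfg = ConfigParser()
    cfg.read(path)  # silently does nothing if the file is missing

    visible = {}
    if cfg.has_section("graphics"):
        # Only the whitelisted keys are surfaced in the menu.
        visible = {k: v for k, v in cfg.items("graphics") if k in GUI_SETTINGS}

    hidden = {}
    if cfg.has_section("experimental"):
        # Applied by the engine if someone added them by hand, but never
        # shown in (or written back by) the options UI.
        hidden = dict(cfg.items("experimental"))

    return {"visible": visible, "hidden": hidden}
```

The menu would only ever read and write the "visible" part, so "maxing the sliders" never touches the experimental block.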
 

jrcbandit

Member
I get so annoyed with PC gamers who hold back the potential graphics options we could have just because of the prevalent "I should be able to max everything" line of thinking.

Seems like the best thing for developers to do would be to give "reduced" graphics features for ultra to please those idiots and ensure 60+ fps on mid- to high-end current cards, then have an experimental mode to unlock extra options that are meant only for SLI or video cards that don't exist yet. When you click to unlock experimental mode, just have a warning that these options are meant for multiple GPUs or future hardware.
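Just to spell the idea out (the option names and warning text here are made up for illustration, not from any shipped game):

```python
# Sketch of the "unlock experimental mode" flow described above.
BASE_OPTIONS = ["texture_quality", "shadow_quality", "ssao"]
EXPERIMENTAL_OPTIONS = ["full_scene_ssaa", "extended_shadow_distance", "uncompressed_textures"]

WARNING = ("These options are intended for multi-GPU setups or future hardware "
           "and are not expected to run well on current single cards.")

def menu_options(confirm_dialog):
    """confirm_dialog: callback that shows the warning and returns True if accepted."""
    options = list(BASE_OPTIONS)
    if confirm_dialog(WARNING):
        options += EXPERIMENTAL_OPTIONS
    return options

# Example: a user who clicks through the warning sees everything.
print(menu_options(lambda msg: True))
```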
 

joms5

Member
I think that's a lot of hoops you require developers to jump through just because people are being irrational.

Is it though?

I know nothing about game development, so I admit I am 100% ignorant about how much work goes into menu design and execution. But if a developer has the idea to make a game future-proof and include graphical options that will pay off years later, isn't it just a matter of making the menus change the .ini?

The patch idea I can understand would be quite a bit more work.

And you're right, it is an irrational thought on the players' part, but then how do we change the mindset that anything other than the highest graphics options is somehow a lesser experience?
 

Kudo

Member
There should be specs that let you run the game to its fullest potential; sometimes devs use "Recommended" for this, but most of the time they don't.

I'd rather have the game optimized for currently existing video cards than for ones that don't exist yet. Of course it's up to developers whether they leave in experimental options that don't work on current hardware, but I can't blame gamers for getting annoyed about it; many of the people I know are "max or nothing" towards PC gaming.
Personally I'm just a little emotional that the low-end card I bought a week ago is already electronic waste and can't even run 1440p@Ultra in Deus Ex.
 
Is it though?

I know nothing about game development, so I admit I am 100% ignorant about how much work goes into menu design and execution. But if a developer has the idea to make a game future-proof and include graphical options that will pay off years later, isn't it just a matter of making the menus change the .ini?

The patch idea I can understand would be quite a bit more work.

And you're right, it is an irrational thought on the players' part, but then how do we change the mindset that anything other than the highest graphics options is somehow a lesser experience?

Yes it is asking for a lot.

- Adding labeling for very performance-intensive features.
- Adding a prompt about what it will run on, which means a lot of testing to make sure of that.
- Doing it separately in a patch also costs a lot of effort, and even then you are screwing over people who have, for example, Titan X SLI at launch.

And even with all those points there are too many other questions to ask. Is it in combination with other future-proof settings? At what resolution? If SLI of the best cards is enough, isn't a prompt needed? What is the acceptable performance level?
 
It stifles game design efforts when we have to spend resources figuring out how to address these types of problems.

Even if I go through the considerable effort of adding an optional feature to let people rent some cloud resources to make up for a weak system, I imagine it would backfire because the game would somehow get perceived as "unoptimized" instead of "scalable".
 

Stumpokapow

listen to the mad man
I think the tendency you point out, Durante, is very unfortunate because it will lead to devs being gun-shy about leaving headroom in the engine. One of the great things about Crysis 1, for example, is that even though no one could get acceptable performance at max settings on day one, over the next few years PCs "grew into" the game. This creates a form of "backwards compatibility" where older games actually look and get BETTER over time, which is great.

But if the headline is "WTF my $1000 gaming rig won't run this at 8K supersampling with God Tier textures at 144fps on day one, ugh, lazy devs" then they'll just respond by throttling settings low enough so that the headlines become positive. Which is a pity, because 10 years from now, it'll be great to play that game in 8K supersampling with God Tier textures at 144fps and now there won't be an option internally without mods.
 

Coolade

Member
Oddly enough, the games coming out on UWP have been the most interesting in terms of graphics options for me lately. Forza Apex 6 lets you set a target framerate and have the game dynamically auto-adjust all of the game's settings, along with resolution scaling, to ensure you hit that target framerate; it's a pretty neat idea.

Furthermore, the new Gears of War 4 graphics options have settled on rating the GPU impact, VRAM impact, and CPU impact of each setting, as well as giving a brief description.
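Roughly the idea, as I understand it (all the numbers, setting names, and thresholds below are invented, not how Forza actually implements it): every frame you compare the measured frame time against the target and step one of the dynamic settings down or up, starting with resolution scale.

```python
# Sketch of a dynamic "target framerate" adjuster. Setting names, level
# counts, and the hysteresis margin are all invented for illustration.
TARGET_FPS = 60
FRAME_BUDGET = 1.0 / TARGET_FPS
MAX_LEVEL = 3  # each dynamic setting has levels 0..3

# Ordered from "drop this first" to "drop this last".
DYNAMIC_SETTINGS = ["resolution_scale", "shadow_quality", "ssao", "motion_blur"]

def adjust_settings(levels, frame_time, margin=0.10):
    """levels: dict mapping setting name -> current level (0 = lowest)."""
    if frame_time > FRAME_BUDGET * (1 + margin):
        # Over budget: lower the first setting that still has room to drop.
        for name in DYNAMIC_SETTINGS:
            if levels[name] > 0:
                levels[name] -= 1
                break
    elif frame_time < FRAME_BUDGET * (1 - margin):
        # Comfortably under budget: raise the cheapest things back up first.
        for name in reversed(DYNAMIC_SETTINGS):
            if levels[name] < MAX_LEVEL:
                levels[name] += 1
                break
    return levels

# Example: a 25 ms frame (too slow for 60fps) drops resolution scale one notch.
levels = {name: MAX_LEVEL for name in DYNAMIC_SETTINGS}
print(adjust_settings(levels, 0.025))
```

A real version would probably also want a way to pin settings you never want touched - just skip those in the loop.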
 

Parsnip

Member
Oddly enough, the games coming out on UWP have been the most interesting in terms of graphics options for me lately. Forza Apex 6 lets you set a target framerate and have the game dynamically auto-adjust all of the game's settings, along with resolution scaling, to ensure you hit that target framerate; it's a pretty neat idea.

It is.
Does it go one step further and allow you to lock certain settings? Like, try to keep me at 60, don't touch SSAO, everything else is fair game?
 
Hahaha, I thought this thread was made today after skimming through the Deus Ex MD review thread and seeing some really flawed reasoning being used to judge how good the PC port of the game was.

This thread, and Durante specifically, was heavily quoted recently on reddit in a discussion of PC optimization and what it means. I don't really go to reddit very often, but one of my friends pointed it out and it seemed to have bled over into 4chan by proxy as well. This is a really important topic to cover, now more than ever with PC really picking up steam... Misconceptions about how "poor" a PC port is can be seriously damaging to its perception. I know I have been guilty of this (even scolded by Durante before). It's never been more obvious to me than now that this is a serious and prevalent problem, especially watching side-by-side footage of games' graphical settings and the performance impact they have for often such negligible IQ improvements.

I think the onus going forward sadly needs to be on developers to be more clear about the actual difference some of their graphical settings make. (Nvidia does a good job of this with their graphics settings rundowns, providing easily comparable screenshots and showing the performance impact, etc.) Media needs to be more understanding of this and actually promote and talk about the desired settings for greater performance and how much or little they actually affect IQ. Lastly, and most challenging, consumers need to be more aware. Posts like Durante's are helping to make something that should be common sense gain a foothold, but it needs to be a concerted effort by our gaming communities to be better about this.
 

TheSeks

Blinded by the luminous glory that is David Bowie's physical manifestation.
If a 1070 is low end now consoles are calculator tier.

GTX 1070 low end. That's something.

I meant low-end investment for next-generation cards. The point is, the 1070 isn't going to be pushing 60FPS past 1080p (more like... 2080 or so?), so people expecting their PCs to pump out 4K resolutions at Ultra on those sorts of cards are running a fool's errand.
 
It's great when devs push the envelope and make games that can't be maxed out on current hardware, like Crysis did previously.

It's bad when the game still looks kind of average at unplayable settings, like Deus Ex: MD. If you're going to throw in GI, SSAA, and PCSS, at least do it on a base game with decent-quality assets.
 
I meant low-end investment for next-generation cards. The point is, the 1070 isn't going to be pushing 60FPS past 1080p (more like... 2080 or so?), so people expecting their PCs to pump out 4K resolutions at Ultra on those sorts of cards are running a fool's errand.

My 980 Ti, which is comparable to a 1070, pushes 1440p at 60+ fps in every game with a mix of max and high settings. Some settings aren't worth the performance hit. I don't think anyone with any single card on the market, 1080 included, is expecting 4K/60, so I'm unsure what point you are trying to make.
 

Brashnir

Member
1080p. The setting most people (read: Consoles and low end) are gonna be targeting.

A lot of PC benchmarks go with higher resolutions. Of course low end (1070) cards aren't gonna hit 60FPS on Ultra with those.

Ah yes, PC gaming GAF. Where insane shit like a 1070 being a low-end card is actually spoken as if it's even remotely true.
 
1080p. The setting most people (read: Consoles and low end) are gonna be targeting.

A lot of PC benchmarks go with higher resolutions. Of course low end (1070) cards aren't gonna hit 60FPS on Ultra with those.

In what world is the 1070 a low-end card? I'm missing the sarcasm, right? I must be.
 

Kaleinc

Banned
The best way is to have low, med, and high presets (including custom settings), plus an ultra preset which only becomes available after adding a -advanced_settings launch option.
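Something like this (the flag name is from the suggestion above; the preset names and flag handling are made up):

```python
# Sketch: only expose the Ultra preset when the game was launched with
# the -advanced_settings flag.
import sys

BASE_PRESETS = ["Low", "Medium", "High", "Custom"]

def available_presets(argv=None):
    args = sys.argv[1:] if argv is None else argv
    presets = list(BASE_PRESETS)
    if "-advanced_settings" in args:
        presets.append("Ultra")
    return presets

# Example: launched normally vs. with the flag.
print(available_presets([]))                      # ['Low', 'Medium', 'High', 'Custom']
print(available_presets(["-advanced_settings"]))  # [..., 'Ultra']
```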
 
1080p. The setting most people (read: Consoles and low end) are gonna be targeting.

A lot of PC benchmarks go with higher resolutions. Of course low end (1070) cards aren't gonna hit 60FPS on Ultra with those.
Wow...


So... the hierarchy goes from low end straight to enthusiast with no tiers in between, huh?

Even funnier when you realize that the 1070 is at the lower end of the enthusiast tier...

I guess none of the high-end, mid-tier, or actual low-end cards exist in your world.
 

dlauv

Member
The standard by which a port should be judged is the console settings. If it runs the console version's settings at 1080p60, then it should be considered a good port. The fact that you can almost always go higher than console settings is a reason why I play PC games.

Whining about a port's quality because you can't run ultra as well as high is insulting to my brain.
 