I think the most important thing is how it runs compared to a console, and what price range of PC can run it at those settings. If a $500 PC can run a game better than an Xbone, that's a positive for PC.
So at what settings should we judge game performance? (or am I missing the point?)
They were pretty clear on that point. The option is highlighted in red, for one, and they said in advance it wasn't meant as an option to run on the PCs of that time.
So at what settings should we judge game performance? (or am I missing the point?)
This mindset has always pissed me off as well. Crysis 3 on low settings looks and runs better than the vast majority of games, but because Ultra is an option it's deemed to have shoddy performance. Why?
I was thinking about bringing up Crysis 3. Low looks very similar to Very High and can run on integrated GPUs.
For comparing across games, the default settings. For comparing to the console performance, console-equivalent settings.
Of course you can also benchmark the "max" settings, but every time you talk about them you should be cognizant of what they actually entail.
The truly bad part is really "game A runs at 60 FPS maxed, game B at 15, ergo game A is much better optimized than game B". And yes, that is what far too many arguments about performance and optimization on gaming forums boil down to.
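To make that concrete, here's a rough sketch of how I'd structure such a comparison — pseudo-Python, with a hypothetical run_benchmark() standing in for launching the game's built-in benchmark and parsing its FPS log (not any real tool):

    PRESETS = ["default", "console_equivalent", "max"]

    def run_benchmark(game: str, preset: str) -> float:
        """Hypothetical stand-in: launch `game` at `preset` and return average FPS."""
        raise NotImplementedError

    def compare(game_a: str, game_b: str) -> None:
        # Only the rows at matching presets say anything about relative optimization;
        # the "max" row mostly measures how far each developer let the sliders go.
        for preset in PRESETS:
            fps_a = run_benchmark(game_a, preset)
            fps_b = run_benchmark(game_b, preset)
            print(f"{preset:>18}: {game_a} {fps_a:5.1f} fps | {game_b} {fps_b:5.1f} fps")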
Options that severely cripple performance without substantially improving the visual experience are great examples of poor coding and developer decisions. Crysis, Metro 2033, The Witcher 2, and any game with either a 4-8x MSAA or a supersampling option included in the menu are examples of this.
No, they are just options, you contrarian.
Crysis 1 scaled well with its graphics settings; nothing unoptimised about that game.
No, they are just options, you contrarian
Crysis 1 was terribly optimized. They are poorly coded/designed options, especially when there are sane and reasonable ways to improve visuals. But again, no developer is going to put any real effort in beyond what the consoles are running.
But again, no developer is going to put any real effort in beyond what the consoles are running.
100% wrong. We already have examples of games having better settings than what's available in the console SKUs. More are to come, of course.
Options that severely cripple performance without substantially improving the visual experience are great examples of poor coding and developer decisions.
I disagree. Just because you don't happen to notice the higher quality settings does not mean they have not been properly implemented and optimised.
Well, most of the PC benchmarks are kind of pointless.
They're not useless at all. You just have to be aware of what, specifically, they are useful for. There are different forms of benchmarks and various testing methods you can use in order to figure out certain things.
People bitch when the 15 fps game only looks on par with or worse than the 60 fps game, and in that case the bitching is perfectly justified. It's pretty much an industry standard at this point that any settings above and beyond the console settings are nothing but half-assed code, worked on only enough so the game still functions. Performance optimization is completely ignored. Typically, unless it can simply be enabled by typing a different word in an .ini file, or Nvidia/AMD come in and do the coding themselves, there will be no extra PC features.
You are completely wrong about pretty much everything.
Edit: this guy has to be trolling. No real effort beyond what the consoles are running?
He's been very anti-PC in his short time here.
100% wrong. We already have examples of games having better settings than what's available in the console skus. More are to come of course.
I disagree. Just because you don't happen to notice the higher quality settings does not mean they have not been properly implemented and optimised.
Just an example: fog shadows in Crysis 3. Demanding, for sure, but the impact on visuals justifies it.
For comparing across games, the default settings. For comparing to the console performance, console-equivalent settings.
Of course you can also benchmark the "max" settings, but every time you talk about them you should be cognizant of what they actually entail.
The truly bad part is really "game A runs at 60 FPS maxed, game B at 15, ergo game A is much better optimized than game B". And yes, that is what far too many arguments about performance and optimization on gaming forums boil down to.
Hence folks saying the last DMC is better "optimized" than Crysis 3....
Cringeworthy.
The truly bad part is really "game A runs at 60 FPS maxed, game B at 15, ergo game A is much better optimized than game B". And yes, that is what far too many arguments about performance and optimization on gaming forums boil down to.
Yeah, better settings that tank performance and are barely distinguishable.
Yeah, better settings that tank performance and are barely distinguishable.
I disagree. They are particularly striking in the case of AC4, for example.
Yeah, better settings that tank performance and are barely distinguishable.
If you've got the performance to spare, then it's nice to have a little extra detail or flourishes.
Fully agreed. The obsession with max settings may be understandable for benchmarks and reviews, or for those gamers who want the best image quality possible, but for the rest of us (the untrained eye, as Maldo described it) the increase in quality when going from High to Ultra is frequently not worth the performance penalty. A mostly stable 60fps takes precedence over any available quality settings. That said, I want those settings there for those who can appreciate them.
And for when I buy a new GPU.
Don't need to be sold any shitty 'remastered editions' of 2-4 year old games on PC, thanks to PC gaming having options that scale up on powerful GPUs.
Remember when almost all PC games used to have a bunch of super demanding options that would scale to future hardware? Those were good times.
Tbh this is a matter of semantics. I vaguely remember Tyrian did it in a cool way back in the day: you had a high quality setting, above that you even had a Pentium setting (Pentiums were quite new), but above that, and this is what I wanna focus on, you had some sort of 'Future' setting.
I think semantics are a part of it, yes, as can be seen in the case of The Witcher 2. Another thing developers absolutely need to do is document each setting and its expected performance impact (though this can be more challenging than you might expect).
The thing I'm most impressed by is great scalability. When a game can be a technical showpiece on top-end hardware but also run respectably on modest hardware, that's when I feel the developers took the time to take advantage of the strengths of the platform.
Agreed. That's another reason why Crysis 3 is so impressive.
Remember when almost all PC games used to have a bunch of super demanding options that would scale to future hardware? Those were good times.
I do remember that time. Mafia (the first one) was colossally demanding, and it was a great pleasure to play two years later at max settings.
Some settings should have a warning before you turn them on; some games give you options that can kill performance for very little return. I don't think they should be hidden, they should just make it clear what each one is and the impact it will have on performance.
Well, that's what Nvidia's GeForce Experience and AMD's Raptr allow you to do. The problem is that neither piece of software gives you a projected FPS target as you adjust the settings. They also don't know when to lower some intensive settings, like Far Cry 3's Post FX.
There should also be an option to automatically tune the game to hit a certain target FPS based on a set of priorities and a benchmark run. Right now games rarely pick reasonable settings if you let them choose automatically.
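A minimal sketch of what I mean, in Python — measure_fps() is hypothetical (apply the settings, run a benchmark pass, return the average), and a real tuner would want per-scene runs, but the greedy idea is simple:

    LEVELS = ["low", "medium", "high", "ultra"]

    def measure_fps(settings: dict) -> float:
        """Hypothetical stand-in: apply `settings`, run a benchmark pass, return avg FPS."""
        raise NotImplementedError

    def autotune(settings: dict, priority: list, target_fps: float) -> dict:
        # `priority` lists setting names from most to least important to the player.
        current = dict(settings)
        while measure_fps(current) < target_fps:
            for name in reversed(priority):       # sacrifice the least important first
                level = LEVELS.index(current[name])
                if level > 0:                     # still room to step this one down
                    current[name] = LEVELS[level - 1]
                    break
            else:
                break                             # everything at "low"; target unreachable
        return current

So autotune({"shadows": "ultra", "post_fx": "ultra"}, priority=["shadows", "post_fx"], target_fps=60.0) would drop post_fx first and touch shadows last.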
Remember when almost all PC games used to have a bunch of super demanding options that would scale to future hardware? Those were good times.
Indeed they were!
The biggest problem I have is that my options always range from Low to Very High. I don't know if Medium in game A corresponds to Very High in game B, or the other way around.
I always feel bad about having to turn down my settings, because while I'm easily satisfied with the graphics of a game, lowering them makes me feel like I'm playing something sub-optimal.
If I knew which settings corresponded to settings in other games, and which settings I do or don't benefit from at my resolution of 1080p (for example, a Very High shadow resolution), then I'd be far less "obsessed" with max settings.
Another good suggestion for graphics settings (hidden in a configurable .ini or not) is that anything that is scalable should have a multiplier value, and anything that is a toggle should be a toggle.
Kinda off-topic, but I would love a settings screen where you actually had to flip little toggles and pull sliders.
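As a rough sketch of what that multiplier/toggle scheme could look like in code — every name here is made up for illustration, not from any real game:

    from dataclasses import dataclass

    @dataclass
    class GraphicsSettings:
        # Scalable options are raw multipliers against a sane baseline...
        render_scale: float = 1.0        # 0.5 = half-res upscale, 2.0 = 2x2 supersampling
        shadow_map_scale: float = 1.0    # multiplies the base shadow map resolution
        lod_distance_scale: float = 1.0  # draw distance relative to the default
        # ...and binary options are plain toggles, nothing in between.
        motion_blur: bool = False
        fog_shadows: bool = False        # the kind of Crysis 3-style extra mentioned earlier

    settings = GraphicsSettings(shadow_map_scale=2.0, fog_shadows=True)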
Every time I see a performance analysis of Metro Last Light that says modern cards struggle at 1080p, when they've set SSAA to max, I cry a little.
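Back-of-the-envelope (Python, assuming the usual definition where NxSSAA shades N samples per output pixel):

    width, height = 1920, 1080
    for ssaa in (1, 2, 4):
        samples = width * height * ssaa
        print(f"{ssaa}x SSAA at 1080p -> {samples:,} samples shaded per frame")

1x gives 2,073,600 (plain 1080p); 4x gives 8,294,400, the same pixel count as native 3840x2160. "Struggles at 1080p" with max SSAA is really a 4K-class workload.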