
Would you mind graphics options on consoles if it let you play at higher fps?

I would buy a new console if it had exactly the same hardware as current gen, but mandated that all games must run at 60fps to pass certification.

I have no problem with graphics options on console games, especially if they let you remove vomit filters, piss filters, film grain, motion blur, depth of field, etc.
 
If you want to get to 60 you are going to have to do work.

Ah yes, work. Effort. Engagement. You talk like developers doing what they're supposed to, working on games, is a bad thing. And don't say they have to put this effort into making the game run well on one setting. They are supposed to put effort into making GOOD GAMES, you know, the ones that are fun. And no, don't say that the direction gaming is going is bad, with bad design choices, over-reliance on cutscenes, etc.; I agree with that, but it's not on topic here. Some gamers, and I'm talking about a significant number of them, want smooth 60 fps gameplay and can't be arsed to play at 30 fps. That is their choice, and I understand them. So if a company gives you the option to tweak graphical options, you can either not bother and play at 30 fps if you don't care, or tweak them a bit and get those nice 60 fps. People who are okay with how things are now will buy the game regardless (assuming it's good and they want it, but I already said we're not discussing that), and the 60 fps crowd will also be satisfied with the product. More sales, voila. The devs just need to put some work into it, but that shouldn't even be a question, because they're supposed to do it regardless; they get paid for it.
 
I already hate that you have to do it on PC for most games. Games should either be optimised for a platform or do the configuring for me. And I'm not talking about some shitty configuring that only takes the raw specs of the computer into account. PC games need some real performance tests so that there is no need for me to tweak the settings in the slightest if I don't want to. The automatic configuration that exists today always sets the bar too high, making the game slow down and/or stutter.
If this is perfected, I wouldn't mind if they removed graphical settings altogether (besides brightness & contrast settings).
 
PC games need some real performance tests so that there is no need for me to tweak the settings in the slightest if I don't want to.

I think you're asking for the impossible, really. They'd have to test it on every possible configuration, every overclock you could possibly make. Do you know how many possibilities there are? Millions is a pretty safe guess; that's how big the hardware market is. If you didn't realise that, maybe now you'll take it into consideration and rethink your request.
 
This is one of the silliest "feisty counter argument" moments I've seen in awhile.

Okay. It's good that you're around to provide your invaluable insight into the subject at hand.


The jump from turning off v-sync in Bioshock alone was significant enough.

And when it comes to AA options, those start stacking up.

Then there's scaling down resolutions, both full screen and texture res...

Then you get into motion blur and the stuff you're talking about...

It adds up. 30-60 is very possible, depending on the game.

BioShock is an argument? BioShock on Unreal 2.5?

Anyways, you can always scale down resolution but again I doubt players would accept that scaling for 60fps. Imagine 30fps mode is 720p and 60fps mode is 540p. Don't you think people would scream "lazy devs" until they were blue in the face?

I still hold that if you want an acceptable looking 60fps game you need to have 60fps as a target, not 30fps as a target and then slash and burn until you hit 60.


Ah yes, work. Effort. Engagement. You talk like developers doing what they're supposed to, working on games, is a bad thing. And don't say they have to put this effort into making the game run well on one setting. They are supposed to put effort into making GOOD GAMES, you know, the ones that are fun. And no, don't say that the direction gaming is going is bad, with bad design choices, over-reliance on cutscenes, etc.; I agree with that, but it's not on topic here. Some gamers, and I'm talking about a significant number of them, want smooth 60 fps gameplay and can't be arsed to play at 30 fps. That is their choice, and I understand them. So if a company gives you the option to tweak graphical options, you can either not bother and play at 30 fps if you don't care, or tweak them a bit and get those nice 60 fps. People who are okay with how things are now will buy the game regardless (assuming it's good and they want it, but I already said we're not discussing that), and the 60 fps crowd will also be satisfied with the product. More sales, voila. The devs just need to put some work into it, but that shouldn't even be a question, because they're supposed to do it regardless; they get paid for it.

I don't have any problem with cutscenes; you really need to stop going off on tangents. When I said it would take work, what I meant was it would take work... that could otherwise be used to improve a 30fps mode. I don't understand at all why I shouldn't be allowed to talk about graphical focus. If you want smooth 60fps all the time, then go spend 3 grand on a new PC and enjoy. Mine's an i7-970 @ 4GHz with 2 overclocked GTX 570s.
 
Anyways, you can always scale down resolution but again I doubt players would accept that scaling for 60fps. Imagine 30fps mode is 720p and 60fps mode is 540p. Don't you think people would scream "lazy devs" until they were blue in the face?

So giving people a choice makes you lazy. Well, I'm sorry, I'm at a loss for words here; posts like this make me feel bad. Like, really bad. I'm seriously concerned about the future of gaming.
 
So giving people a choice makes you lazy. Well, I'm sorry, I'm at a loss for words here; posts like this make me feel bad. Like, really bad. I'm seriously concerned about the future of gaming.

I'm not saying it would be lazy; I said people would call them lazy. They would be perceived as not having put effort into their 60fps mode, which would lead to some individuals calling them lazy, because as we've seen from this thread, some people have no problem calling devs lazy for no good goddamn reason at all.
 
I don't mind a check box here or there, but I'd rather the dev take the time to find the balance. Disabling v-sync in American Nightmare introduced a ton of tearing, but the frame rate was much, much better. I think I kept it at the smart (middle) setting.
 
Okay. It's good that you're around to provide your invaluable insight into the subject at hand.




BioShock is an argument? BioShock on Unreal 2.5?

Anyways, you can always scale down resolution but again I doubt players would accept that scaling for 60fps. Imagine 30fps mode is 720p and 60fps mode is 540p. Don't you think people would scream "lazy devs" until they were blue in the face?

I still hold that if you want an acceptable looking 60fps game you need to have 60fps as a target, not 30fps as a target and then slash and burn until you hit 60.




I don't have any problem with cutscenes; you really need to stop going off on tangents. When I said it would take work, what I meant was it would take work... that could otherwise be used to improve a 30fps mode. I don't understand at all why I shouldn't be allowed to talk about graphical focus. If you want smooth 60fps all the time, then go spend 3 grand on a new PC and enjoy. Mine's an i7-970 @ 4GHz with 2 overclocked GTX 570s.
If you spend more than $1500 on a PC you're being extravagant.

Also, CoD still sells even though it runs at 600p. Doesn't seem like people care that much about resolution.
 
I'm not saying it would be lazy; I said people would call them lazy. They would be perceived as not having put effort into their 60fps mode, which would lead to some individuals calling them lazy, because as we've seen from this thread, some people have no problem calling devs lazy for no good goddamn reason at all.

I know what you meant, and I'm sorry for not making it clear. I'm just fucking depressed that gaming, my passion for 12 years now, is going down, like fucking DOWN.
 
People are probably going to laugh, but I'd suggest console games offer a single video settings toggle in options. Call it 'Fast/Pretty'.

Fast: Optimized for speed, minimal effects. Guaranteed to run at a framerate of 60 FPS, though this may entail reduced shader quality/lighting/screen resolution, tearing, etc., depending on how much needed to be turned off in order to hit that target.

Pretty: All effects enabled. Default setting for games.

Keep it nice and simple.
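Internally, the Fast/Pretty toggle above could boil down to two fixed presets the engine switches between. Here's a minimal sketch of that idea; all setting names and values are illustrative assumptions, not any real engine's options.

```python
# Two hypothetical presets for the 'Fast/Pretty' toggle described above.
# Values are made up for illustration.
PRESETS = {
    "pretty": {  # default: all effects on, 30 fps cap
        "fps_cap": 30, "vsync": True, "motion_blur": True,
        "depth_of_field": True, "resolution": 720,
    },
    "fast": {    # speed first: effects off, 60 fps target
        "fps_cap": 60, "vsync": False, "motion_blur": False,
        "depth_of_field": False, "resolution": 600,
    },
}

def apply_preset(name):
    """Look up a preset; a real engine would push these values to
    the renderer instead of just returning them."""
    return PRESETS[name]

print(apply_preset("fast"))
```

The point of the design is that the player only ever sees one switch; everything else stays the developer's decision.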
 
Okay. It's good that you're around to provide your invaluable insight into the subject at hand.




BioShock is an argument? BioShock on Unreal 2.5?

Anyways, you can always scale down resolution but again I doubt players would accept that scaling for 60fps. Imagine 30fps mode is 720p and 60fps mode is 540p. Don't you think people would scream "lazy devs" until they were blue in the face?

I still hold that if you want an acceptable looking 60fps game you need to have 60fps as a target, not 30fps as a target and then slash and burn until you hit 60.




I don't have any problem with cutscenes; you really need to stop going off on tangents. When I said it would take work, what I meant was it would take work... that could otherwise be used to improve a 30fps mode. I don't understand at all why I shouldn't be allowed to talk about graphical focus. If you want smooth 60fps all the time, then go spend 3 grand on a new PC and enjoy. Mine's an i7-970 @ 4GHz with 2 overclocked GTX 570s.
$3k? Bwahaha. You could easily get that done with $1k.
 
People are probably going to laugh, but I'd suggest console games offer a single video settings toggle in options. Call it 'Fast/Pretty'.

Fast: Optimized for speed, minimal effects. Guaranteed to run at a framerate of 60 FPS, though this may entail reduced shader quality/lighting/screen resolution, tearing, etc., depending on how much needed to be turned off in order to hit that target.

Pretty: All effects enabled. Default setting for games.

Keep it nice and simple.

That would be a good start if you ask me. If it's well received, you could expand on the idea slowly but gradually by adding more options.

$3k? Bwahaha. You could easily get that done with $1k.

I got that done with $600 (CPU, GPU, memory, motherboard, HDD, and power supply).
 
So because you wouldn't want to tweak it, nobody else should be able to have that option?

If it becomes the norm, so be it, but it wouldn't be for the best; devs would abuse it and take even more shortcuts. There is a world of difference between PCs and consoles in this regard, and everyone being overdramatic about how strange/insecure/bizarre/etc. people who don't like the idea are is simply being extremely disingenuous.
 
This thread has got me thinking on something related to PC games that I hate.

Why can't there be an option to say "I want 60fps in this resolution, make it so". Surely it wouldn't be that hard for a game to run a quick torture test, take the average FPS, change a few settings automatically (ranked by fps impact / visual quality loss) and then redo the torture test until complete?

That's the one aspect of PC gaming that I really, really hate. I just want it to run at my native resolution and at a certain minimum fps whilst still retaining as much of the eye candy as possible. Is that too much to ask for? The developers obviously know far better than me what scenes in the game represent the average / peak loads that you would expect to see.
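The loop described above — benchmark, lower the cheapest-looking setting, benchmark again — can be sketched in a few lines. Everything here is a made-up illustration: `run_benchmark` is a stub standing in for a real developer-chosen worst-case scene, and the settings ladder and FPS numbers are invented.

```python
# Sketch of the auto-tuning idea: step settings down, ordered by
# (big FPS gain, small visual loss) first, until a benchmark of a
# worst-case scene hits the target framerate.
TARGET_FPS = 60

SETTINGS_LADDER = [
    ("motion_blur",    [True, False]),
    ("shadow_quality", ["high", "medium", "low"]),
    ("antialiasing",   ["4x", "2x", "off"]),
    ("resolution",     [1080, 900, 720]),
]

def run_benchmark(settings):
    """Stub: render the benchmark scene and return average FPS.
    Here we just pretend each step down any ladder buys ~8 FPS
    over a 35 FPS fully-maxed baseline."""
    steps = sum(ladder.index(settings[name])
                for name, ladder in SETTINGS_LADDER)
    return 35 + 8 * steps

def auto_tune(target=TARGET_FPS):
    # Start with everything maxed out, then lower one notch at a time.
    settings = {name: ladder[0] for name, ladder in SETTINGS_LADDER}
    for name, ladder in SETTINGS_LADDER:
        for level in ladder:
            settings[name] = level
            if run_benchmark(settings) >= target:
                return settings  # target reached: stop lowering things
    return settings  # even minimum settings miss the target

print(auto_tune())
```

A real implementation would re-run the benchmark scene after each change rather than use a closed-form stub, which is why the poster's "do it while I make coffee" framing is apt: it's slow, but it's fully automatable.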
 
BF3 is also guilty. I remember a screenshot from a presentation with the tunnel map, where it showed a frame before post-processing and after post-processing (color correction).




We aren't saying that devs should go with the lowest settings. I expect devs to release a game where everything that can be on is on, and it runs at 30fps or 60fps.

Some people get sick or nauseous from certain effects, like bad motion blur. For those people, being able to turn motion blur off makes the game more enjoyable. Or say a dev botched a post-processing AA method that destroys IQ instead of improving it. Wouldn't it be better to be able to turn that filter off? It would probably net you a more stable framerate too. The game would still be v-synced, so fps would always be capped at 30 or 60fps.


I think some very specific graphics options are possible. However, it's added debug time that nobody in the industry wants to take on. So yeah, in another world. :/
 
Anyways, you can always scale down resolution but again I doubt players would accept that scaling for 60fps. Imagine 30fps mode is 720p and 60fps mode is 540p. Don't you think people would scream "lazy devs" until they were blue in the face?

You are aware that massively popular games like CoD and Halo don't actually render at 720p, right? That doesn't stop them from selling millions. It's pretty obvious that a lot of gamers do care about performance more than image quality. A simple option to select your render resolution (600p, 720p, 1080p) would go a long way in letting console gamers optimize their experience. The ability to adjust AA, AF and basic screen filters (film grain, motion blur, etc) would go even further without requiring any significant amount of technical knowledge from the player.

It really shocks me that people are against the idea of having choice and being able to tailor their experience to their own preferences. I didn't realize the mentality between PC gamers and console gamers was so vastly different.
 
This thread has got me thinking on something related to PC games that I hate.

Why can't there be an option to say "I want 60fps in this resolution, make it so". Surely it wouldn't be that hard for a game to run a quick torture test, take the average FPS, change a few settings automatically (ranked by fps impact / visual quality loss) and then redo the torture test until complete?

That's the one aspect of PC gaming that I really, really hate. I just want it to run at my native resolution and at a certain minimum fps whilst still retaining as much of the eye candy as possible. Is that too much to ask for? The developers obviously know far better than me what scenes in the game represent the average / peak loads that you would expect to see.
This complaint makes no sense.
 
This thread has got me thinking on something related to PC games that I hate.

Why can't there be an option to say "I want 60fps in this resolution, make it so". Surely it wouldn't be that hard for a game to run a quick torture test, take the average FPS, change a few settings automatically (ranked by fps impact / visual quality loss) and then redo the torture test until complete?

That's the one aspect of PC gaming that I really, really hate. I just want it to run at my native resolution and at a certain minimum fps whilst still retaining as much of the eye candy as possible. Is that too much to ask for? The developers obviously know far better than me what scenes in the game represent the average / peak loads that you would expect to see.


It's really not that simple. Most PC games already detect "optimal" settings for your hardware, but these settings are rarely optimal because there are so many variables to consider. There's no way for the developers to know what you're willing to sacrifice in terms of image quality. There's also no way for them to know which features are the most demanding on your specific hardware.

There was a game called Sacrifice that let you specify a desired framerate and the game would adjust texture resolutions and mesh LOD in real-time to achieve that. Unfortunately, that resulted in a lot of weird deformation.
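The real-time variant the Sacrifice example describes is essentially a feedback controller: measure each frame's cost and nudge a global detail scale toward the frame-time budget. A toy sketch of that loop, with made-up gain values and a fake linear cost model standing in for a real renderer:

```python
# Proportional controller: over budget -> lower detail,
# under budget -> raise it. All numbers are illustrative.
TARGET_FRAME_MS = 1000.0 / 60.0  # 60 fps budget

def update_detail(detail, frame_ms, gain=0.02):
    """Nudge the global detail scale toward the frame-time budget,
    clamped to [0.1, 1.0]."""
    error = TARGET_FRAME_MS - frame_ms  # positive means headroom
    detail = detail + gain * error
    return max(0.1, min(1.0, detail))

def simulate_frame_ms(detail):
    """Fake cost model: frame time grows linearly with detail,
    maxing out at 25 ms (40 fps) when detail is 1.0."""
    return 10.0 + 15.0 * detail

# Toy simulation: the controller settles where the budget is met.
detail = 1.0
for _ in range(200):
    detail = update_detail(detail, simulate_frame_ms(detail))
print(detail)
```

The "weird deformation" complaint maps directly onto this: because the controller reacts every frame, texture and mesh detail visibly pop as the scale oscillates, which is why most engines since have preferred coarser, slower adjustments like dynamic resolution scaling.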
 
Blurrier...but smoother...

Jaggier...but sharper...

Blurrier...but smoother...

Jaggier...but sharper...

Blurrier...but smoother...

Jaggier...but sharper...

This is a veritable Sophie's Choice. How am I supposed to make this decision???
 
Anyways, you can always scale down resolution but again I doubt players would accept that scaling for 60fps. Imagine 30fps mode is 720p and 60fps mode is 540p. Don't you think people would scream "lazy devs" until they were blue in the face?

You are aware that massively popular games like CoD and Halo don't actually render at 720p, right? That doesn't stop them from selling millions. It's pretty obvious that a lot of gamers do care about performance more than image quality. A simple option to select your render resolution (600p, 720p, 1080p) would go a long way in letting console gamers optimize their experience. The ability to adjust AA, AF and basic screen filters (film grain, motion blur, etc) would go even further without requiring any significant amount of technical knowledge from the player.

It really shocks me that people are against the idea of having choice and being able to tailor their experience to their own preferences. I didn't realize the mentality between PC gamers and console gamers was so vastly different.

Yes, I am. You are aware how much shit gaffers talk about this gen not actually being HD. Yes? I also think it's pretty clear that there are no titles that offer a direct comparison between performance and image quality so trying to suggest that "it's pretty obvious that a lot of gamers do care about performance more than image quality" rings hollow.
 
What's not to understand? There are so many variables in getting the best out of a game on PC; having the ability to simplify the process is always a good thing.
Exactly. I really don't want to sit there messing around with settings for an hour to get the perfect balance. The dev knows exactly what would be a good scene to run a torture test with, and which options would have the best effects on the frame rate.

Game runs tests, changes settings (other than the ones I've set as fixed), tests again (etc), job done.

I really, really hate the first couple of times I play any moderately hardware challenging PC game for the simple fact that I know I'll be in and out of the settings until I can get it running relatively smoothly whilst not having to have it look like shit. Doesn't help that I'm very sensitive to tearing and frame drops.

I don't see why it would be so much to ask for to have it do a bit of automated testing whilst I'm doing something more enjoyable / productive. Especially as many of these games already have benchmark demos / scenes built in anyway.
 
What's not to understand? There are so many variables in getting the best out of a game on PC; having the ability to simplify the process is always a good thing.
I didn't say I don't understand what he's asking; I said it makes no sense.
He's asking for something that would take an insane amount of work, IF it's possible at all (exactly *because* there are so many variables), just to spare himself the "hassle" of spending a couple of minutes with a game's settings.
 
Why would it take an insane amount of time? As a developer myself I could add something along those lines in a matter of days, even quicker than that after the first time I did it. There is a very limited subset of settings the demo would need to take account of and they've clearly already got a reasonable idea of what they may be with some of the suggested settings (medium, high etc).

...and I wish it was just a couple of minutes. More often than not the settings I start with end up being too high 10 minutes into the game, and then those settings are wrong half an hour into the game... all because I hadn't yet hit a scene representative of the loads the game would be putting on the hardware on average.

Maybe you've got ridiculous hardware, so it's set and forget, but for me it takes a hell of a lot of tweaking for some games unless I want to have tearing or framedrops all over the place. It's one of the reasons that I play far more games on my consoles... at least at the start of the gen when they're comparable in power to PCs. Towards the end of each gen (now) I find myself back on PCs, and annoyed at the amount of tweaking required.
 
Why would it take an insane amount of time? As a developer myself I could add something along those lines in a matter of days, even quicker than that after the first time I did it. There is a very limited subset of settings the demo would need to take account of and they've clearly already got a reasonable idea of what they may be with some of the suggested settings (medium, high etc).

...and I wish it was just a couple of minutes. More often than not the settings I start with end up being too high 10 minutes into the game, and then those settings are wrong half an hour into the game... all because I hadn't yet hit a scene representative of the loads the game would be putting on the hardware on average.

Maybe you've got ridiculous hardware, so it's set and forget, but for me it takes a hell of a lot of tweaking for some games unless I want to have tearing or framedrops all over the place. It's one of the reasons that I play far more games on my consoles... at least at the start of the gen when they're comparable in power to PCs. Towards the end of each gen (now) I find myself back on PCs, and annoyed at the amount of tweaking required.

I completely understand what you are saying. I get more joy now in playing games. I don't want to worry about compatibility between hardware and drivers anymore. If there were a Steam Box, I'd buy it in a second. Nearly all of my computing at home is on Mac or iOS, and the thought of building a Windows box to play Diablo 3 is stressing me out.
 
Kingdom Hearts on PSP has options too. You can increase the color depth from 16 to 32-bit, sacrificing performance, but counter it by increasing the CPU speed from 222MHz to 333MHz, at the expense of battery life.
 
Yes, I am. You are aware how much shit gaffers talk about this gen not actually being HD. Yes? I also think it's pretty clear that there are no titles that offer a direct comparison between performance and image quality so trying to suggest that "it's pretty obvious that a lot of gamers do care about performance more than image quality" rings hollow.

It doesn't ring hollow at all. If Activision said that the next CoD would run at 30 FPS, 720p instead of 60 FPS, 600p, what do you think the average fan reaction would be? I suspect it would be outrage. Gamers have plenty of reference to compare image quality and performance. When you compare CoD or Halo to games that actually render at 720p, there's a pretty notable difference. Upscaling inevitably results in a blurrier image with more obvious aliasing but most console gamers are willing to accept that in return for 60 FPS. There's a pretty huge difference between 60 FPS and 30 FPS and plenty of reference for both.

If console gamers really cared about image quality, they wouldn't be console gamers. They'd be PC gamers. However, everyone cares about performance, especially in a competitive multiplayer shooter.
 
Why would it take an insane amount of time? As a developer myself I could add something along those lines in a matter of days, even quicker than that after the first time I did it. There is a very limited subset of settings the demo would need to take account of and they've clearly already got a reasonable idea of what they may be with some of the suggested settings (medium, high etc).

...and I wish it was just a couple of minutes. More often than not the settings I start with end up being too high 10 minutes into the game, and then those settings are wrong half an hour into the game... all because I hadn't yet hit a scene representative of the loads the game would be putting on the hardware on average.

Maybe you've got ridiculous hardware, so it's set and forget, but for me it takes a hell of a lot of tweaking for some games unless I want to have tearing or framedrops all over the place. It's one of the reasons that I play far more games on my consoles... at least at the start of the gen when they're comparable in power to PCs. Towards the end of each gen (now) I find myself back on PCs, and annoyed at the amount of tweaking required.

Tweaking isn't "required" for the vast majority of PC games. You only feel compelled to do it because you actually have the ability to do so. Framerate drops, tearing and other visual/performance issues are commonplace in console games. The difference is that there's nothing you can do about it. I don't really see how that's better than letting the player tweak their settings and resolve issues themselves.
 
This thread is amazing. We have people saying PC developers should put in work in order to make games run at some standardised parameters for each and every permutation and combination of PC component parts. And we have people looking after precious console developers lest they be made to work to incorporate a few graphics settings.
 
Worst idea ever?

You have devs doing unplayable high-detail modes simply for a box shot or two.

But what you'd actually be playing is some grossly watered-down version.

DO NOT WANT
 
I really don't think console devs need to be given the option not to properly optimize their games. I'm pretty sure that would not end well.

So no. Leave that for the PC games.
 
Bad idea, v-sync option should be the most for consoles, if you want all those options just go PC gaming.
 
I think you're asking for the impossible, really. They'd have to test it on every possible configuration, every overclock you could possibly make. Do you know how many possibilities there are? Millions is a pretty safe guess; that's how big the hardware market is. If you didn't realise that, maybe now you'll take it into consideration and rethink your request.

All I know is that games have done these kinds of stress tests before. I remember FFXI having quite a nice one prior to launch. I don't see it as far-fetched that you could use the data from such a test to tweak settings automatically based on the results.
 
If I could deactivate motion blur, v-sync and AA (last two depending on the game) and use more AF on every console game I would be happy.
 
I would love having the option to turn off stuff like motion blur in console games. That's pretty basic and obvious so it shouldn't scare people.
woah woah woah keep that complicated tweaking stuff away from my consoles

turning off checkboxes what the hell man i dont have a phd you know

Worst idea ever?

You have devs doing unplayable high-detail modes simply for a box shot or two.

But what you'd actually be playing is some grossly watered-down version.

DO NOT WANT
That's actually the reality right now for multiplat games, oops.
 
If we're talking the type of options we generally get in PC games (not stuff like editing .ini files and more in-depth tweaks like we see in some of the PC modding threads here), I think it's fine. I'm all for giving players more options, although there is something to be said about the Steve Jobs approach of carefully controlling the user experience, too. That's the key, though: carefully. I don't think that's generally the case for multiplat games.
 
Kingdom Hearts: Birth by Sleep lets you adjust the color depth (less slowdown vs. prettier graphics) and adjust the system's internal clock speed (more battery life vs. less slowdown) right from the in-game menu.

And that's on the PSP. I never understood why 99% of console games don't have any graphical options besides "Brightness."
 