Yeah, I can clearly see how this option made you not buy this game :lol
I imagine he only eats pop tarts and hot pockets
If you want to get to 60 you are going to have to do work.
PC games need some real performance tests, so that I don't have to tweak the settings in the slightest if I don't want to.
This is one of the silliest "feisty counter-argument" moments I've seen in a while.
The jump from turning off v-sync in Bioshock alone was significant enough.
And when it comes to AA options, those start stacking up.
Then there's scaling down resolutions, both full screen and texture res...
Then you get into motion blur and the stuff you're talking about...
It adds up. 30-60 is very possible, depending on the game.
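As a rough back-of-the-envelope sketch of how those tweaks add up (all the percentages here are invented for illustration, not measurements from any game):

```python
def frame_time_after_tweaks(base_ms, savings):
    """Apply a sequence of fractional frame-time reductions, one per tweak."""
    t = base_ms
    for s in savings:
        t *= (1.0 - s)
    return t

# Start from a ~30 fps frame budget, then four hypothetical tweaks:
# AA off (-20%), lower res (-25%), texture res (-15%), motion blur off (-10%).
base = 33.3
tweaked = frame_time_after_tweaks(base, [0.20, 0.25, 0.15, 0.10])
# tweaked lands under the ~16.7 ms per frame needed for 60 fps
```

The point is only that several modest savings multiply together, which is why 30 to 60 is plausible in some games even when no single setting gets you there alone.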
Ah yes, work. Effort. Engagement. You talk like developers doing what they're supposed to, working on games, is a bad thing. And don't say they'd have to put this effort into making the game run well on one setting; they're supposed to put effort into making GOOD GAMES, you know, the ones that are fun. And no, don't say that the direction gaming is going is bad, with bad design choices, over-reliance on cutscenes, etc. I agree with that, but it's not on topic here. Some gamers, and I'm talking about a significant number of them, want smooth 60 fps gameplay and can't be arsed to play at 30 fps. That's their choice, and I understand them. So if a company gives you the option to tweak graphical settings, so that you can either not bother with them and play at 30 fps if you don't care, or tweak them a bit and get those nice 60 fps, then people who are okay with how things are now will buy the game regardless (assuming it's good and they want it, but I already said we're not discussing that), and the 60 fps crowd will also be satisfied with the product. More sales, voila. The devs just need to put some work into it, but that shouldn't even be a question, because they're supposed to do it regardless; they get paid for it.
Anyways, you can always scale down resolution but again I doubt players would accept that scaling for 60fps. Imagine 30fps mode is 720p and 60fps mode is 540p. Don't you think people would scream "lazy devs" until they were blue in the face?
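For what it's worth, the arithmetic behind that 720p-vs-540p trade-off is easy to check; 540p shades only a bit over half the pixels of 720p:

```python
# Pixel counts for the two hypothetical render resolutions above.
p720 = 1280 * 720   # 921,600 pixels
p540 = 960 * 540    # 518,400 pixels
ratio = p540 / p720  # 0.5625 -> 540p is ~56% of the pixel work of 720p
```

So the fill-rate saving is real, which is exactly why the perception problem ("lazy devs") rather than the technical feasibility is the sticking point.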
So giving people a choice makes you lazy. Well, I'm sorry, I'm at a loss for words here; posts like this make me feel bad. Like really bad. I'm seriously concerned about the future of gaming.
If you want smooth 60fps all the time then go spend 3 grand on a new PC and enjoy. Mine's an i7-970@4ghz with 2 OCed GTX 570s
If you spend more than $1500 on a PC you're being extravagant.
Okay. It's good that you're around to provide your invaluable insight into the subject at hand.
Bioshock is an argument? Bioshock, on Unreal 2.5?
Anyways, you can always scale down resolution but again I doubt players would accept that scaling for 60fps. Imagine 30fps mode is 720p and 60fps mode is 540p. Don't you think people would scream "lazy devs" until they were blue in the face?
I still hold that if you want an acceptable looking 60fps game you need to have 60fps as a target, not 30fps as a target and then slash and burn until you hit 60.
I don't have any problem with cutscenes; you really need to stop going off on tangents. When I said it would take work, what I meant was it would take work... work that could otherwise be used to improve a 30fps mode. I don't understand at all why I shouldn't be allowed to talk about graphical focus. If you want smooth 60fps all the time then go spend 3 grand on a new PC and enjoy. Mine's an i7-970 @ 4GHz with two OCed GTX 570s.
I'm not saying it would be lazy. I said people would call them lazy; they would be perceived as not having put effort into their 60fps mode, which would lead to some individuals calling them lazy, because as we've seen from this thread, some people have no problem calling devs lazy for no good goddamn reason at all.
$3k? Bahwahaha. You could easily get that done with $1k.
People are probably going to laugh, but I'd suggest console games offer a single video settings toggle in options. Call it 'Fast/Pretty'.
Fast: Optimized for speed, minimal effects. Guaranteed to run at a framerate of 60 FPS, though this may entail reduced shader quality/lighting/screen resolution, tearing, etc., depending on how much needed to be turned off in order to hit that target.
Pretty: All effects enabled. Default setting for games.
Keep it nice and simple.
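The Fast/Pretty toggle suggested above could be as simple as two preset bundles. This is just an illustrative sketch; the setting names and values are invented, not from any real game:

```python
# Two hypothetical preset bundles behind a single video toggle.
PRESETS = {
    "pretty": {"resolution": (1280, 720), "motion_blur": True,
               "post_aa": True, "target_fps": 30},
    "fast":   {"resolution": (960, 540), "motion_blur": False,
               "post_aa": False, "target_fps": 60},
}

def apply_video_mode(mode="pretty"):
    """Return the settings for the chosen mode; 'pretty' is the default."""
    if mode not in PRESETS:
        raise ValueError("mode must be 'fast' or 'pretty'")
    return dict(PRESETS[mode])
```

One toggle, two fixed bundles: no slider soup, which keeps it in the spirit of "nice and simple" for console players.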
So because you wouldn't want to tweak it, nobody else should be able to have that option?
BF 3 is also guilty.
I remember a screenshot from a presentation with the Tunnel map, where it showed a frame before post-processing and after post-processing (color correction).
We aren't saying that devs should ship with the lowest settings.
I expect devs to release a game where everything that can be on is on, running at 30fps or 60fps.
Some people get sick or nauseous from certain effects, like bad motion blur.
For those people, if they can turn motion blur off, the game becomes more enjoyable.
Or say a dev fucked up a post-processing AA method so that it destroys IQ instead of improving it.
Wouldn't it be better to be able to turn that post-processing filter off?
It would probably net you a more stable framerate too.
The game will still be v-synced, so fps will always be capped at 30 or 60fps.
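That v-sync cap falls out of the display refresh: a frame that misses the 16.7 ms deadline on a 60 Hz screen waits for the next refresh, so effective fps snaps to whole divisors of the refresh rate. A small sketch of that arithmetic:

```python
import math

def vsynced_fps(frame_ms, refresh_hz=60):
    """Effective fps when every frame must wait for a vertical refresh."""
    refresh_ms = 1000.0 / refresh_hz
    intervals = math.ceil(frame_ms / refresh_ms)  # refresh periods consumed per frame
    return refresh_hz / intervals

# 15 ms of work fits in one refresh -> 60 fps;
# 18 ms just misses and waits a full extra refresh -> 30 fps.
```

This is why the cap sits at 60 or 30 (or 20) rather than anywhere in between, and why disabling a heavy filter can flip you from one tier to the next.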
This thread has got me thinking on something related to PC games that I hate.
Why can't there be an option to say "I want 60fps in this resolution, make it so". Surely it wouldn't be that hard for a game to run a quick torture test, take the average FPS, change a few settings automatically (ranked by fps impact / visual quality loss) and then redo the torture test until complete?
That's the one aspect of PC gaming that I really, really hate. I just want it to run at my native resolution and at a certain minimum fps whilst still retaining as much of the eye candy as possible. Is that too much to ask for? The developers obviously know far better than me what scenes in the game represent the average / peak loads that you would expect to see.
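The "make it so" loop described above can be sketched in a few lines. This is a toy illustration only: `run_benchmark` stands in for the game's own torture test, and the setting names and fps costs are invented:

```python
def auto_tune(settings, cost_ranking, run_benchmark, target_fps=60):
    """Disable settings (ranked by fps impact) until the target is hit."""
    for name in cost_ranking:
        if run_benchmark(settings) >= target_fps:
            break
        settings[name] = False  # turn the next-ranked option off and retest
    return settings

# Toy benchmark: each enabled option costs some (invented) fps.
COSTS = {"motion_blur": 5, "post_aa": 10, "high_res_textures": 15}

def fake_benchmark(settings):
    return 70 - sum(c for k, c in COSTS.items() if settings.get(k))

tuned = auto_tune({"motion_blur": True, "post_aa": True, "high_res_textures": True},
                  ["motion_blur", "post_aa", "high_res_textures"],
                  fake_benchmark)
```

A real implementation would need a representative benchmark scene (which, as the post says, the devs are best placed to pick) and a ranking of settings by fps impact versus visual loss, but the control loop itself is this simple.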
This complaint makes no sense.
Anyways, you can always scale down resolution but again I doubt players would accept that scaling for 60fps. Imagine 30fps mode is 720p and 60fps mode is 540p. Don't you think people would scream "lazy devs" until they were blue in the face?
You are aware that massively popular games like CoD and Halo don't actually render at 720p, right? That doesn't stop them from selling millions. It's pretty obvious that a lot of gamers do care about performance more than image quality. A simple option to select your render resolution (600p, 720p, 1080p) would go a long way in letting console gamers optimize their experience. The ability to adjust AA, AF and basic screen filters (film grain, motion blur, etc) would go even further without requiring any significant amount of technical knowledge from the player.
It really shocks me that people are against the idea of having choice and being able to tailor their experience to their own preferences. I didn't realize the mentality of PC gamers and console gamers was so vastly different.
Exactly. I really don't want to sit there messing around with settings for an hour to get the perfect balance. The dev knows exactly what would be a good scene to run a torture test with, and which options would have the best effect on the frame rate.
What's not to understand? There are so many variables to getting the best out of a game with PC gaming; having the ability to simplify the process is always a good thing.
I didn't say I don't understand what he's asking; I said it makes no sense.
The question: would you mind being able to tweak some advanced graphics options, like texture quality or visual effects, on consoles if it let you play at 60 fps as a result?
If I want to dick around with settings I'll turn to my PC.
Btw, I hate that they removed the options for the grainy filter and the motion blur in Mass Effect 3.
Seriously?? Fucking Bioware. That's the only way I could enjoy those games.
Why would it take an insane amount of time? As a developer myself I could add something along those lines in a matter of days, even quicker than that after the first time I did it. There is a very limited subset of settings the demo would need to take account of and they've clearly already got a reasonable idea of what they may be with some of the suggested settings (medium, high etc).
...and I wish it was just a couple of minutes. More often than not the settings I start with end up being too high 10 minutes into the game, and then those settings are wrong half an hour into the game... all because I hadn't yet hit a scene representative of the loads the game would be putting on the hardware on average.
Maybe you've got ridiculous hardware, so it's set and forget, but for me it takes a hell of a lot of tweaking for some games unless I want to have tearing or framedrops all over the place. It's one of the reasons that I play far more games on my consoles... at least at the start of the gen when they're comparable in power to PCs. Towards the end of each gen (now) I find myself back on PCs, and annoyed at the amount of tweaking required.
Yes, I am. You are aware of how much shit gaffers talk about this gen not actually being HD, yes? I also think it's pretty clear that there are no titles offering a direct comparison between performance and image quality, so trying to suggest that "it's pretty obvious that a lot of gamers do care about performance more than image quality" rings hollow.
I think you're asking for the impossible. Like, really? They'd have to test it on every possible configuration, every OC you could possibly make? Do you know how many possibilities there are? Millions is pretty much a safe guess; yes, that's how big the hardware market is. If you didn't realize that, maybe now you'll take it into consideration and rethink your request.
fuck off with that noise.
If I want to dick around with settings I'll turn to my PC.
If only all games could be played on PC....
woah woah woah keep that complicated tweaking stuff away from my consoles
I would love having the option to turn off stuff like motion blur in console games. That's pretty basic and obvious, so it shouldn't scare people.
That's actually the reality right now for multiplat games, oops.
Worst idea ever?
You have devs doing unplayable high-detail modes simply for a box shot or two,
but what you'd actually be playing is some grossly watered-down version.
DO NOT WANT