Always 60 fps.
What good are 'enhanced graphics' if they are 30 fps?
> If the visual quality jump is 2x higher than visuals at 60fps, then I'll take 30fps. If not, I'll take 60fps.
And if the quality jump is 4x higher, then would you take 15fps?!
> And if the quality jump is 4x higher, then would you take 15fps?!
Then I'd reduce resolution or use upscaling techniques (if available).
If those are the options, what is your choice?
> Necessity is the mother of all invention.
My thoughts exactly; the other way around would just be switching things off until the frame time is right... but may I change it a bit? (Assuming you weren't thinking of it this way; if you were, I misunderstood.)
If you target 60 as a foundation, you will push for more clever solutions to reach fidelity. Go with 30, and you will be less disciplined than you could be.
60 needs to be the target this gen. The CPU imbalance of last gen for draw calls is no longer a factor.
> PS5 games at 60fps in performance mode seem like PS4 Pro 2 games. 30fps quality modes are where next gen starts.
We feel you, bro; we and Shmunter are on it!
> Depends on how much fidelity I can get.
Consoles got none.
I'd take the 30fps Witcher 3 E3 trailer over the 60fps retail one.
> 60 FPS all the way. 30 FPS is just so bad. Playing Valhalla in performance mode, it's so much better than Quality mode.
> For those who might claim that stable 30 is better than fluctuating 60 (between 50-60 FPS), it's really not; there is that thing called VRR, and it makes everything smooth.
Isn't VRR only available on some TVs/monitors and not on others?
> Cringe.
Don't ruin the fun, Dad. It's only jokes. And, importantly, the interwebs.
> I am pleased with the poll results.
To be fair, we do not represent most people. We're mostly really into tech and care about fps or fidelity. I don't remember anyone complaining last gen that TLOU2 and GoT were 30fps.
#justsaynoto30fps
> Always 60 fps.
> What good are 'enhanced graphics' if they are 30 fps?
Tell that to every game last generation. 30fps is perfectly playable. There wasn't a single time last generation when I thought the frame rate ruined a good game, and they still made my jaw drop.
> I love how 30fps usually comes with a fancy, marketable buzzword before it.
> Most of the time it's "Cinematic". Now it's "Fidelity".
> Because how else can you sell that shit to people?
How else do you expect a real generational leap in graphics if frame rate doesn't take a hit?
> I love how 30fps usually comes with a fancy, marketable buzzword before it.
> Most of the time it's "Cinematic". Now it's "Fidelity".
> Because how else can you sell that shit to people?
What would you call 4k@30fps vs 1440p@60fps if not fidelity?
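For what it's worth, the raw pixel arithmetic behind that comparison is easy to check (an illustrative sketch, not from anyone in the thread):

```python
# Pixels shaded per frame at the two resolutions being compared.
pixels_4k = 3840 * 2160      # 8,294,400 pixels
pixels_1440p = 2560 * 1440   # 3,686,400 pixels

# 4K pushes 2.25x the pixels of 1440p per frame, while 60fps
# renders twice as many frames per second as 30fps.
print(pixels_4k / pixels_1440p)  # 2.25
print(pixels_4k * 30)            # pixels shaded per second at 4k@30
print(pixels_1440p * 60)         # pixels shaded per second at 1440p@60
```

Per second, 4k@30 actually shades slightly more pixels than 1440p@60 (about 249M vs 221M), which is roughly why the two modes often end up as comparable GPU loads.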
> What would you call 4k@30fps vs 1440p@60fps if not fidelity?
Isn't it crazy how different people prioritize different things? I go for maximum graphics. Even when I played on PC, I would often max out resolution and graphics even if the framerate had to be lower.
Maybe it's time for you to go see a doctor.
> Isn't it crazy how different people prioritize different things? I go for maximum graphics. Even when I played on PC, I would often max out resolution and graphics even if the framerate had to be lower.
For me, it obviously depends on the game.
> Isn't VRR only available on some TVs/monitors and not on others?
> Not really an option for everyone.
Nah. The only instance where 30 FPS can be the better option is if the game drops from 60 into the 35-40 FPS area way too fast, which is unusual and a result of poor optimization.
You are making it sound like 60fps is better, but only if you have the right screen. Otherwise 30fps might be better?
> How else do you expect a real generational leap in graphics if frame rate doesn't take a hit?
This is a new thing. When the PS2/GC/XBOX came along, both graphics and frame rate improved. There were no sacrifices.
> What would you call 4k@30fps vs 1440p@60fps if not fidelity?
If you think stuttering about at 30fps counts as "fidelity", maybe you should fix your brain.
Maybe it's time for you to go see a doctor and stop being a smartass who can't use their brain for once.
> This is a new thing. When the PS2/GC/XBOX came along, both graphics and frame rate improved. There were no sacrifices.
It's impossible to have no sacrifices for a higher frame rate. There were sacrifices back then; you just didn't know what they were. Frame rate always comes at a cost because there's no such thing as unlimited power.
Same with previous gens. All had 60fps as standard.
> It's impossible to have no sacrifices for a higher frame rate. There were sacrifices back then; you just didn't know what they were. Frame rate always comes at a cost because there's no such thing as unlimited power.
> Also, it goes the other way too. Fidelity comes at the cost of frame rate. The more stuff you cram into each frame, the longer it takes to render each frame.
There are no sacrifices if you make games from the ground up with 60fps in mind. How much better do you think Rogue Leader or Metroid Prime would look on the GC if those games were 30fps? They would be more detailed, sure, but by how much? These games are already some of the best-looking GC games, and their smooth frame rate is part of that.
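The frame-time arithmetic behind that point can be sketched as follows (illustrative numbers, not from the thread itself):

```python
def frame_budget_ms(fps: float) -> float:
    """Milliseconds available to render one frame at a given frame rate."""
    return 1000.0 / fps

# Targeting 60fps leaves roughly half the time per frame that 30fps does,
# so every extra millisecond of "fidelity" work eats the budget twice as fast.
print(round(frame_budget_ms(60), 2))  # 16.67 ms per frame
print(round(frame_budget_ms(30), 2))  # 33.33 ms per frame
```

With double-buffered vsync, missing the 16.67 ms budget by even a little typically drops the game to the next even divisor of the refresh rate (30fps on a 60 Hz display), which is why frame-rate targets are usually fixed up front rather than left to float.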
> Sheesh, it's like 30 fps has activists working for it.
> Their argument is always: I think 30 fps is fine, therefore so should you.
Maybe not so much "I think it's fine and so should you" but more "why are you so militant about 60fps".