Well, 120 is better than 60.

True. And 200+ gsync is better than 120.
Big HDTVs can't come soon enough with VRR. Enthusiast monitors are already amazing.
But in your big open world games or single player epics I stand by my point that it makes no sense to do anything but 30fps.
But you can't. The whole point is that consoles demand that you choose one or the other.
Rocket League is a multiplayer online game. I specified single player games. Obviously, once actual human competition is involved, response time becomes critical and framerate plays its part. This is why you typically see 60fps in multiplayer games like fighters, racers, online shooters etc. So this is certainly a genre-by-genre case.
It makes a difference in input and reaction. Don't kid yourself otherwise.
Movies are different, though: you don't feel the framerate because you aren't controlling the action. I'm perfectly fine with a solid 30, but it blows my mind that some people argue that 30 is BETTER. I can't think of any situation where I wouldn't choose to run a game at 60 given the choice.
I believe 30 FPS is best for high fidelity cinematic experiences, 60 FPS should be mandatory for anything that requires very precise input (like fighting games) or very fast twitch shooters.
At the highest level of play, sure. But if you think more than 1% of players actually notice the difference created by 0.017 seconds of faster input, you're the one who's kidding yourself.
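For reference, the 0.017-second figure above is just the gap between frame times at 30fps and 60fps. A quick sanity check (nothing game-specific, just arithmetic):

```python
# Frame time (seconds per frame) at each rate.
frame_time_30 = 1 / 30  # ~0.0333 s per frame
frame_time_60 = 1 / 60  # ~0.0167 s per frame

# Worst-case extra delay per frame from running at 30 instead of 60.
difference = frame_time_30 - frame_time_60
print(round(difference, 3))  # 0.017
```

So the disputed number checks out; the argument is only about whether anyone notices it.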
Meanwhile 24 FPS is still the golden standard for non-interactive fiction.
People say: 60fps vs eye candy (on console at least).
But people forget 60fps IS eye candy!
Like some people said before, give me options to choose from. Like Nioh.
Just accept that not everybody cares about frame rates.
Yes, I can tell the difference between 60 and 30. I just don't care.
Buy a PC. It's not that difficult a problem to solve, if it means that much to you.
This is something I've never understood about some hardcore PC gamers, and why they keep berating console gamers over it.
If someone cared enough about framerate, they would already have a PC, it's as simple as that.
Asking "WHY DON'T YOU CARE ABOUT FRAMERATE?" is a stupid question. Some people just don't.
Now, people saying that they can't tell the difference at all are just being ignorant. You might not play enough 60fps games to tell the difference on the spot, but there is no medical condition that prevents you from differentiating if shown some tests.
I meant more along the lines of not increasing polycounts or world size by huge amounts.

FPS wouldn't change that.
Halving performance =/= halving content.
Aren't there more 60fps games this generation than last?
Feels that way to me anyway, in which case the trend has swung in a positive direction.
The thread can be closed after this comment:

I can't believe we are in 2017 and people still expect a standard 60 FPS on consoles.
Try playing Quake at 30fps, it's pretty terrible.
Sorry, but it will never be a standard on consoles.

Maybe not. If VR takes off and there's a desire to have all games VR capable, wouldn't that lead to pushing 60fps as the new floor on framerate?
In 2027 you will say the same.
I don't even think 60 fps is standard on PC yet unless you think every pc gamer has a decent rig. I am sure most can get 60fps if they dial down the graphics settings. It can be standard on consoles in the same way but why sacrifice graphics for something that is not necessary to sell games to the general gaming public, who isn't counting fps numbers. 30 fps is perfectly playable, whether on console or pc.
Reading the OP isn't that difficult either. He already plays on PC.
It's a valid argument. Consoles should aim for 60fps instead of being so pretty, but doing so means they can't use screenshots to sell a game.
Well, unless they're just bullshots but then they're kinda pointless anyway.
Everything is going in the wrong direction if you ask me. Fuck resolution. Fuck making games prettier. Games need to be more responsive. Games need to be 60fps as a standard.
I wasn't really speaking specifically to the OP.
It's not a valid argument. Consoles have to produce games that attract customers over the course of a 5-7 year life cycle. They don't have the luxury of an ever-improving hardware configuration to constantly be pushing the envelope.
Most people don't care about 60 fps. The responsiveness they get from 30 fps is enough. Bottom line is that console makers and developers do not have any incentive to sacrifice resolution for frame rate. In fact, they are incentivized towards resolution.
The PC market exists for those gamers who want the power to have higher frame rates. Trying to push that mentality on the console market is a non-starter.
Let's use Uncharted 4 as an example.
Main game is 1080p/30 and looks amazing.
MP is 900p/60 and looks good.
Do you think Uncharted 4 would be as memorable for people as it is now if the whole game looked like the MP?
Do you think this scene would've made the same impression it did on a lot of people if the game looked worse?
I don't think so.
Proves my point. It's memorable because it looks pretty.
I'm so looking forward to variable refresh rates becoming the norm so we can have much more nuanced conversations. A huge part of the problem right now is the frame rate cliff. I suspect many of the 60fps or bust crowd would be perfectly happy with titles that dip down from the 60s and 70s into the 40s and 50s so long as you still get nice smooth frame pacing.
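To put the frame-pacing point in concrete terms (hypothetical numbers, not a capture from any real game): two runs can average the same frame rate while feeling totally different, because stutter comes from the spread of frame times, not the average.

```python
from statistics import mean, pstdev

# Hypothetical frame times in milliseconds: same average, different pacing.
smooth = [20, 20, 20, 20, 20]   # even 20ms delivery, a locked 50fps
uneven = [10, 30, 10, 30, 20]   # also averages 20ms, but judders

for name, times in [("smooth", smooth), ("uneven", uneven)]:
    avg_fps = 1000 / mean(times)       # identical for both sequences
    jitter = pstdev(times)             # spread drives perceived stutter
    print(f"{name}: {avg_fps:.0f}fps average, {jitter:.1f}ms jitter")
```

Both sequences report the same average fps, but only the first feels smooth, which is exactly why VRR plus good frame pacing matters more than hitting a round number.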
It would be more responsive and fluid, not necessarily better.

It still would be a better game if it ran at 60fps, and for the record I love the Uncharted series. The Last of Us is a better game on PS4 because of the 60fps option.
It won't ever be; some prioritize eye candy and others performance.