
40 FPS is Becoming a Category in the Console Space

All those Fidelity modes are locked to 30FPS, but there's still a lot of headroom left on the table; GoW running at 40-50FPS in its unlocked-FPS mode is the best example here, so unlocking that extra headroom doesn't sacrifice the visuals at all. 40FPS is a good compromise/workaround for the lack of VRR. It may seem like just 10 extra frames, no biggie, but those 10FPS are in fact 33% more than what you'd get otherwise, and the difference is really quite significant:


The point I'm getting at is this: I don't think most FPS-oriented gamers would settle for 40fps when a higher-FPS mode is already available. On the other hand, most fidelity-oriented gamers are more than fine with 30fps, provided that all available headroom is used for maximum fidelity. In the case of GoW, maybe the better solution would have been to tweak the visuals to either reach 60fps (catering to the performance crowd) or use the headroom to improve graphical quality even further at 30fps (catering to the graphics crowd). 40fps is neither here nor there in my view.
 

Soodanim

Gold Member
Outside of monitors the industry largely skipped 1440p. It’s only MS in the console space that bothered, and they’ve always been good for resolution support owing to their PC background.

I don't see 40fps becoming a standard, as it relies on that small group of 120Hz displays that sit between 60Hz and 120Hz+VRR. By the time enough people have 120Hz displays for 40fps to be a worthwhile inclusion on consoles, most of those displays will have VRR anyway, so an unlocked option makes more sense.

Don’t forget that monitor users are such a tiny niche of the console space that they - for better or worse - are not worth catering to.
 

ZywyPL

Banned
The point I'm getting at is this: I don't think most FPS-oriented gamers would settle for 40fps when a higher-FPS mode is already available. On the other hand, most fidelity-oriented gamers are more than fine with 30fps, provided that all available headroom is used for maximum fidelity. In the case of GoW, maybe the better solution would have been to tweak the visuals to either reach 60fps (catering to the performance crowd) or use the headroom to improve graphical quality even further at 30fps (catering to the graphics crowd). 40fps is neither here nor there in my view.

But that's impossible to achieve. The framerate is always variable by nature, dependent on the displayed scene; if you aim for 30 you always need headroom to be prepared for worst-case scenarios. If the devs pushed the visuals to sit right on the edge of 30 like you suggest, the games would drop into the 20s or even below as soon as the on-screen action gets more dense than the usual walking around empty scenery, just like the PS360 games at the end of that generation that barely ran at 30FPS, because the devs kept pushing for better visuals and leaving less and less headroom.

Bottom line is, nobody loses anything with a 40FPS mode. People who prefer maximum visuals get exactly what they want, 4K30 with maxed-out graphics; those preferring 60FPS also get a mode for them; and those who prefer maximum visuals but happen to have a 120Hz display can squeeze a bit of extra performance out of their TVs. It's basically a win-win-win scenario.
 

Daymos

Member
So 3 repeated frames of 40, 40, 40 is better than 1 frame at 60/60? Nope.

How about 60, 60 at 120Hz? We can do it! Even 40 with VRR/40Hz is bad; that's too slow for your eyes, I think.
 

Amiga

Member
In the case of GoW, maybe the better solution would have been to tweak the visuals to either reach 60fps (catering to the performance crowd) or use the headroom to improve graphical quality even further at 30fps (catering to the graphics crowd). 40fps is neither here nor there in my view.

A 30fps lock needs headroom so it doesn't drop or stutter, so engine developers make enough frame-time room for the gameplay/art designers to work in. After all that work is done, many games end up with extra performance that averages around 45fps. At that point image quality is already optimal; the only thing to add after the main art design is done is makeup.

Developers that don't give enough room end up with a mess like Cyberpunk.

And a 40fps option costs nothing. Console games already have a performance mode for 60fps that needs to scale down quality to find 100% more performance; for 40fps they only need about 33% more. So there's no reason at all not to have it.
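A rough sketch of that headroom arithmetic, assuming a GPU-bound game whose required throughput scales linearly with the target frame rate (a simplification, not how real engines behave):

```python
# Extra rendering throughput needed to move from a 30 fps baseline to a
# higher target, under the linear-scaling assumption described above.

def required_speedup(baseline_fps: float, target_fps: float) -> float:
    """Fractional increase in throughput needed to go from baseline_fps
    to target_fps, assuming frame cost scales linearly."""
    return target_fps / baseline_fps - 1.0

for target in (40, 60, 120):
    extra = required_speedup(30, target) * 100
    print(f"30 -> {target} fps needs roughly {extra:.0f}% more performance")

# Output:
# 30 -> 40 fps needs roughly 33% more performance
# 30 -> 60 fps needs roughly 100% more performance
# 30 -> 120 fps needs roughly 300% more performance
```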
 

yamaci17

Member
It's about giving people a choice; it's not like they're shunning 30 fps modes by introducing 40 fps modes. Why not give users with 120Hz screens a faster alternative at 40 fps instead of 30 fps? 60Hz people can still happily play at 30 fps, and 40 fps at 120Hz will sync perfectly, so it won't be choppy or juddery.

Repeating a frame once or ten times does not make it any more choppy. You could run a 30 fps game on a 240Hz screen, where each frame would be repeated 8 times, and you would still get the same result you get on a 60Hz screen.

So 40 fps running at 120Hz is no different from running it at 80Hz, or 240Hz for that matter. As a matter of fact, a higher refresh rate will have slightly less input lag.

So no, 30/60 is not the holy grail of a smooth, non-juddery experience. I've played a lot of 30 fps games at 120Hz (frames repeated 4 times) and I can safely say that the experience matches the one at 60Hz (frames repeated 2 times), and it can actually turn out better due to having slightly less input lag.
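A tiny sketch of the even-multiple rule being described here; the refresh rates listed are just common examples:

```python
# A frame rate displays without judder on a fixed-refresh screen when the
# refresh rate is an integer multiple of it, so every frame is held for the
# same number of refresh cycles.

REFRESH_RATES = [60, 80, 120, 144, 240]  # common panel refresh rates

def judder_free(fps: int, hz: int) -> bool:
    return hz % fps == 0

for fps in (30, 40, 60):
    evenly = [hz for hz in REFRESH_RATES if judder_free(fps, hz)]
    repeats = [hz // fps for hz in evenly]
    print(f"{fps} fps maps cleanly onto {evenly} Hz (frame shown {repeats} times)")

# Output:
# 30 fps maps cleanly onto [60, 120, 240] Hz (frame shown [2, 4, 8] times)
# 40 fps maps cleanly onto [80, 120, 240] Hz (frame shown [2, 3, 6] times)
# 60 fps maps cleanly onto [60, 120, 240] Hz (frame shown [1, 2, 4] times)
```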



Nvidia users can try this themselves with NVIDIA Profile Inspector; it's very easy to set up.

nv inspector: https://github.com/Orbmu2k/nvidiaProfileInspector/releases

I still prefer VRR + a RivaTuner frame lock instead. Say I'm confident that I will get a locked 40 fps in a game: I just enable FreeSync and lock the FPS directly to 40 via RivaTuner. VRR will then work at around 119-121Hz (low framerate compensation, aka LFC) and provide effective vsync without enabling vsync itself. This is much better, because you get rid of the huge input lag vsync introduces.
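A toy model of what this post describes, where LFC repeats each frame a whole number of times so the panel runs near its maximum refresh; the 48-120Hz VRR window is an assumed example, and real driver heuristics differ:

```python
# Toy LFC model: repeat each frame the largest whole number of times that
# still fits under the panel's maximum refresh, so a low frame rate is
# presented inside the VRR window. The 48-120 Hz window is an assumption
# for illustration, not a real panel spec; actual drivers behave differently.

def lfc_refresh(fps: float, vrr_min: float = 48.0, vrr_max: float = 120.0) -> float:
    multiplier = int(vrr_max // fps)      # e.g. 120 // 40 = 3 repeats
    refresh = fps * multiplier
    if refresh < vrr_min:
        raise ValueError("frame rate cannot be mapped into this VRR window")
    return refresh

print(lfc_refresh(40))  # 120.0 -> each 40 fps frame shown 3 times, ~120 Hz
print(lfc_refresh(30))  # 120.0 -> each 30 fps frame shown 4 times, ~120 Hz
```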
 

e&e

Banned
Is there a reason MS/Sony don't have a system-wide "Performance or Graphics" mode which the devs can just set up with if/then checks in their games? Or is that a thing already?
 

01011001

Banned
Is there a reason MS/Sony don't have a system-wide "Performance or Graphics" mode which the devs can just set up with if/then checks in their games? Or is that a thing already?

Sony has that, but most games don't really support it, and games that do support it often won't let you change it in-game.

A system-wide setting is nonsense; every game is different, and therefore having the option in-game is way better than any system setting.

So it's wise of the devs not to use the system setting on PS5.
 

yamaci17

Member
Haven't tested it myself, but is it that big of a deal between 30 and 40?
Yes, it can be.

Perceived smoothness increases by a huge amount from 30 to 60.

30 fps - 33.3 ms
40 fps - 25 ms (8.3 ms reduction compared to 30 fps)
50 fps - 20 ms
60 fps - 16.6 ms
120 fps - 8.3 ms (8.3 ms reduction compared to 60 fps)
240 fps - 4.2 ms

It's the same reason why going from 30 to 60 is such a huge deal while going from 60 to 240 is not that much of one.

You get a 16.6 ms reduction going from 30 to 60, but only about a 12.5 ms reduction going from 60 to 240.

That's why anything above 120 fps mostly provides diminishing returns in terms of fluidity; 120-144 fps is usually the last stop where you get really efficient returns.

You might think this is some kind of joke, but it's not. Imagine how drastically things change going from 15 fps to 30 fps; the same effect won't happen going from 100 fps to 115 fps.

This is why a 10 fps increase at the 30 fps mark can be really huge.
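For reference, the frame-time numbers above are just 1000 ms divided by the frame rate; a quick way to reproduce them and the per-step savings:

```python
# Frame time in milliseconds at each frame rate, plus the absolute time saved
# per frame at each step -- the diminishing-returns point made above.

steps = [15, 30, 40, 60, 120, 240]

def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

for prev, cur in zip(steps, steps[1:]):
    saved = frame_time_ms(prev) - frame_time_ms(cur)
    print(f"{prev:>3} -> {cur:>3} fps: {frame_time_ms(cur):4.1f} ms/frame "
          f"({saved:.1f} ms saved)")

# Output:
#  15 ->  30 fps: 33.3 ms/frame (33.3 ms saved)
#  30 ->  40 fps: 25.0 ms/frame (8.3 ms saved)
#  40 ->  60 fps: 16.7 ms/frame (8.3 ms saved)
#  60 -> 120 fps:  8.3 ms/frame (8.3 ms saved)
# 120 -> 240 fps:  4.2 ms/frame (4.2 ms saved)
```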
 

SSfox

Member
Yes, it can be.

Perceived smoothness increases by a huge amount from 30 to 60.

30 fps - 33.3 ms
40 fps - 25 ms (8.3 ms reduction compared to 30 fps)
50 fps - 20 ms
60 fps - 16.6 ms
120 fps - 8.3 ms (8.3 ms reduction compared to 60 fps)
240 fps - 4.2 ms

It's the same reason why going from 30 to 60 is such a huge deal while going from 60 to 240 is not that much of one.

You get a 16.6 ms reduction going from 30 to 60, but only about a 12.5 ms reduction going from 60 to 240.

That's why anything above 120 fps mostly provides diminishing returns in terms of fluidity; 120-144 fps is usually the last stop where you get really efficient returns.

You might think this is some kind of joke, but it's not. Imagine how drastically things change going from 15 fps to 30 fps; the same effect won't happen going from 100 fps to 115 fps.

This is why a 10 fps increase at the 30 fps mark can be really huge.
Sounds great, gotta try this myself. But I think 60fps will be the best pick in most cases.
 
Although it's not really a linear progression, think of the difference that 10fps can make between 20fps and 30fps. While the frame count difference is the same between 30fps and 40fps, the percentage difference is a little smaller.
As wonderful as VRR is, when you're around the 30fps mark there's not that much VRR can do to assist. It's far more useful for dealing with discrepancies at higher frame rates.
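A quick check of the percentage point above:

```python
# The same +10 fps step is a larger relative jump at a lower baseline.

def pct_increase(base: float, new: float) -> float:
    return (new - base) / base * 100.0

print(f"20 -> 30 fps: +{pct_increase(20, 30):.0f}%")  # +50%
print(f"30 -> 40 fps: +{pct_increase(30, 40):.0f}%")  # +33%
```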
 

Dream-Knife

Banned
While 40fps is better than 30, it's still very laggy and choppy compared to the creamy smooth feeling of 60 fps, and that is still not even close to how silky smooth 120+ is. I just wish anything below 60 wouldn't ever be allowed in any title whatsoever; 60 should be the absolute bare minimum standard all games run at.
It's all about what you're used to. I couldn't tell the difference between 30 and 60 until I got a 170Hz monitor. Now anything under 100 feels slow.

Does add some atmosphere to Dark Souls though.
 

Larxia

Member
I have never understood this myth about variable/adaptive refresh rate making everything smooth as if it were magic.

Even with G-Sync, which is supposed to be the best at this, I can still totally feel it if a game drops from 60 fps to 50, just like I would on a regular monitor; it's exactly the same.

The only big advantage (and that's an important one, so I'm not saying this tech is bad) of variable refresh rate is that you don't need to rely on v-sync anymore to avoid tearing, which is a great plus since it gets rid of v-sync latency. But that's really just it; it doesn't magically make 40 fps feel smoother than on a regular monitor or whatever. The framerate feels the same, and the fluctuations too.
 

JeloSWE

Member
It's all about what you're used to. I couldn't tell the difference between 30 and 60 until I got a 170Hz monitor. Now anything under 100 feels slow.

Does add some atmosphere to Dark Souls though.
Wow, I didn't know they made 170Hz monitors; it doesn't divide evenly into anything useful, weird choice.

I get that if one grew up with last-gen consoles, one might think 30fps is normal and okay, but I've played so many great games at 60 fps throughout my gaming life that anything below 60 seems like a pathetic compromise made to get good-looking screenshots for marketing. One should make a game run at a minimum of 60, then make it look as good as possible with the HW available.
 

rofif

Can’t Git Gud
I would play this shit at 24 fps if it looked great.
Remember Dark Souls? It looked pretty incredible in 2011 and ran at 12 fps in some spots :p
But all I remember from this game are the visuals and... a good game.
After years pass, you remember the visuals, not the framerate. 40 is undeniably better of course, so the more 40 and 60 games, the better.
 

Dream-Knife

Banned
Wow, I didn't know they made 170Hz monitors; it doesn't divide evenly into anything useful, weird choice.

I get that if one grew up with last-gen consoles, one might think 30fps is normal and okay, but I've played so many great games at 60 fps throughout my gaming life that anything below 60 seems like a pathetic compromise made to get good-looking screenshots for marketing. One should make a game run at a minimum of 60, then make it look as good as possible with the HW available.
It works for most PC games since there's no fps cap. Turn on FreeSync and limit the GPU with Radeon Chill and you're all set.

I agree with 60 being the minimum, but consoles have always done this. They promised 120 with the PS3.
 

Dream-Knife

Banned
I would play this shit at 24 fps if it looked great.
Remember Dark Souls? It looked pretty incredible in 2011 and ran at 12 fps in some spots :p
But all I remember from this game are the visuals and... a good game.
After years pass, you remember the visuals, not the framerate. 40 is undeniably better of course, so the more 40 and 60 games, the better.
It's been 20 years and I still remember the stutter from Perfect Dark.

I do also play it regularly though.
 

Kenpachii

Member
Yes, it can be.

Perceived smoothness increases by a huge amount from 30 to 60.

30 fps - 33.3 ms
40 fps - 25 ms (8.3 ms reduction compared to 30 fps)
50 fps - 20 ms
60 fps - 16.6 ms
120 fps - 8.3 ms (8.3 ms reduction compared to 60 fps)
240 fps - 4.2 ms

It's the same reason why going from 30 to 60 is such a huge deal while going from 60 to 240 is not that much of one.

You get a 16.6 ms reduction going from 30 to 60, but only about a 12.5 ms reduction going from 60 to 240.

That's why anything above 120 fps mostly provides diminishing returns in terms of fluidity; 120-144 fps is usually the last stop where you get really efficient returns.

You might think this is some kind of joke, but it's not. Imagine how drastically things change going from 15 fps to 30 fps; the same effect won't happen going from 100 fps to 115 fps.

This is why a 10 fps increase at the 30 fps mark can be really huge.

33 to 25 isn't a big deal; 33 to 16 is, 16 to 8 is, and 8 to 4 is, because each step halves the input lag.

25ms is better than 33ms, but it's still nowhere near 60 fps.

And frankly, what's the point of even focusing on 40 fps? Isn't the idea of 30 fps to push visuals? You're basically sacrificing that for a bit more performance.

Honestly pointless; just focus on 60 fps and actually get a jump in responsiveness.

Also, if you're saying 60 to 240 isn't a massive jump, you've honestly never played on it.
 

01011001

Banned
33 to 25 isn't a big deal; 33 to 16 is, 16 to 8 is, and 8 to 4 is, because each step halves the input lag.

25ms is better than 33ms, but it's still nowhere near 60 fps.

And frankly, what's the point of even focusing on 40 fps? Isn't the idea of 30 fps to push visuals? You're basically sacrificing that for a bit more performance.

Honestly pointless; just focus on 60 fps and actually get a jump in responsiveness.

Also, if you're saying 60 to 240 isn't a massive jump, you've honestly never played on it.

Almost no 30fps-locked game would actually run at just 30fps if it were unlocked.

Look at Control on Series X: it's locked to 30fps in RT mode but basically runs constantly above 40fps, and even close to 50fps in scenes that aren't super demanding.

Yet it is locked to 30fps in order to have consistent frame pacing.

VRR and 120Hz TVs give games like that the opportunity to feel way smoother to play, with barely any visual impact, if any at all.

The only reason these games are locked to 30fps is therefore often 60Hz TVs with no VRR support.
 

Dream-Knife

Banned
Almost no 30fps-locked game would actually run at just 30fps if it were unlocked.

Look at Control on Series X: it's locked to 30fps in RT mode but basically runs constantly above 40fps, and even close to 50fps in scenes that aren't super demanding.

Yet it is locked to 30fps in order to have consistent frame pacing.

VRR and 120Hz TVs give games like that the opportunity to feel way smoother to play, with barely any visual impact, if any at all.

The only reason these games are locked to 30fps is therefore often 60Hz TVs with no VRR support.
I think a more appropriate question to ask is: what percentage of the people who buy a console care enough about VRR, or will spend that amount of money on a new TV?

They could get a monitor, but there's no 1440p or DisplayPort support.
 

JCK75

Member
As a PC gamer who demands 144Hz in multiplayer shooters, a secret I've long kept to myself is that in single-player games I'm perfectly content with anything 40fps and up, so long as it's stable and looks good.
 

01011001

Banned
I think a more appropriate question to ask is: what percentage of the people who buy a console care enough about VRR, or will spend that amount of money on a new TV?

They could get a monitor, but there's no 1440p or DisplayPort support.

Does that matter? Adding an option to disable a 30fps lock, or adding a 40fps mode for 120Hz displays, barely takes any work.
 

martino

Member
I've tried it and I like it.
It's not far from the target I have with my current config to get as near as possible to 4K for now (48fps, i.e. 144/3).
But did I miss other games, or is this a generalization from one example?
 

jaysius

Banned
Better optimization of the games is the answer, along with setting realistic goals and targets so nobody gets pissed at substandard framerates.

Jesus, I started talking sense there for a second. The fuck is wrong with me.
 