
Why do frame rates go 30, 60, 120? Why not 90?

Just something I thought of today after watching HokiHoshi's new video on how frame rate affects performance in racing games. (Spoiler: it matters)



Clearly 30fps is the worst, and this aligns with my personal experience as well. It's also part of the reason I dropped Monster Hunter Rise on Switch and moved to PC. That buttery smooth 60fps makes the game feel and look better. It's also why I only play Forza Horizon 5 on my PC, where I can have good graphics and 60fps.

But in these discussions it's always 30 or 60, and the big dogs sometimes talk about 120. Why does the frame rate need to double again instead of just adding 30? 90 would still be divisible by it. It would still be better than 60 and wouldn't cost as much resource-wise. Yet I've never really seen this mentioned. Any specific reason, or is it just not enough of a difference?
 

The_Mike

I cry about SonyGaf from my chair in Redmond, WA
I'm not sure of the full explanation, but it has something to do with vertical synchronisation.

A G-Sync or FreeSync monitor doesn't need this, and can run with a variable framerate without these problems.
 

64bitmodels

Reverse groomer.
well, you don't see monitors advertising themselves as 90Hz, like how you don't see TVs advertising themselves as 1440p, so that's probably one reason
 

01011001

Banned
90fps would be a really good target framerate, but that's only possible with VRR, because without VRR the framerate would not divide evenly into the screen refresh.

but if you target 90fps, then having the framerate unlocked up to 120fps is the logical choice, as fluctuations between 120fps and 90fps will still feel completely smooth on a VRR display. That means locking to 90fps would be pointless in most cases: a locked 90fps would imply there's enough headroom to guarantee it, so just running the game unlocked would most likely result in 100+fps at all times in such a title.
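
As a rough illustration of why that fluctuation stays smooth, here's a small sketch. The 48-120Hz VRR window is just a typical example I'm assuming, not something from a spec or this thread:

```python
# Frame times for an unlocked 90-120 fps game on a VRR display.
# Assumes a typical 48-120 Hz VRR window; actual windows vary by panel.
VRR_MIN_HZ, VRR_MAX_HZ = 48, 120

for fps in (90, 100, 110, 120):
    frame_time_ms = 1000 / fps
    inside_window = VRR_MIN_HZ <= fps <= VRR_MAX_HZ
    print(f"{fps:3d} fps = {frame_time_ms:5.2f} ms per frame, "
          f"shown the moment it's ready: {inside_window}")

# The whole 90-120 fps range only spans ~8.3-11.1 ms per frame, so a VRR panel
# can present every frame as soon as it's done instead of rounding to fixed ticks.
```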
 
Last edited:

Hugare

Member
Why does the frame rate need to double again instead of just adding 30? 90 would still be divisible by it.



... You may want to try to divide 120 by 90 again, OP

The result may be shocking

Math is hard

Now the serious answer: 120Hz divides evenly by 60, 40 and 30. It offers more options.

A 90Hz display would only offer 90, 45 or 30, which would suck. That's why 120Hz is the current standard.

You wouldn't want to consume content at 90fps on a 120Hz display, 'cause that would cause some serious frame-pacing issues, making your experience less than ideal.

If you have a Freesync/Gsync panel though, you can lock games at 90fps with no problem, of course, 'cause VRR adjusts the frame pacing on the fly.
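
To make the "more options" point concrete, here's a quick sketch that lists which frame rates divide evenly into a few example refresh rates (the candidate list and the refresh rates are just illustrative):

```python
# Frame rates that divide evenly into a fixed refresh rate, i.e. every frame
# can be held for a whole number of refreshes with no judder.
CANDIDATE_FPS = (24, 30, 40, 45, 48, 60, 72, 90, 120)

def even_divisors(refresh_hz):
    return [fps for fps in CANDIDATE_FPS if fps <= refresh_hz and refresh_hz % fps == 0]

for refresh_hz in (60, 90, 120, 144):
    print(f"{refresh_hz} Hz -> {even_divisors(refresh_hz)}")
# 60 Hz -> [30, 60]
# 90 Hz -> [30, 45, 90]
# 120 Hz -> [24, 30, 40, 60, 120]
# 144 Hz -> [24, 48, 72]
```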
 
Last edited:
Everything is arbitrary BS is the short answer

The real answer is, content is made at 24, 30 and 60 fps traditionally and developers will extrapolate to refresh rates that are common and multiples of their native framerate. 90Hz isn't a common refresh outside of laptop panels, it's certainly not seen anywhere near console development, and it only really solves for 30fps. I sympathise though, as I am writing this from a laptop with a 90Hz panel.
 

Three

Member
For the same reason we didn't have 40fps games on 60Hz even though both 20 and 30fps fit: it doesn't divide without a remainder. For 24fps film content, look up three-two pulldown. Frankly, most framerate and refresh differences are way overblown and most aren't visible to the masses. For something like PSVR, 90Hz might be a good option in VR.
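
If you're curious what that pulldown cadence looks like, here's a tiny, simplified sketch of the 3:2 pattern for 24fps content on a 60Hz display (ignoring interlacing and the 59.94/60 distinction):

```python
# Three-two pulldown: 24 fps film on a 60 Hz display. Film frames alternate
# between being held for 3 refreshes and 2 refreshes, so 24 frames fill 60 refreshes.
def pulldown_pattern(film_frames):
    return [3 if i % 2 == 0 else 2 for i in range(film_frames)]

print(pulldown_pattern(8))        # [3, 2, 3, 2, 3, 2, 3, 2]
print(sum(pulldown_pattern(24)))  # 60 -> one second of film fills one second of display
```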
 
Last edited:
... You may want to try to divide 120 by 90 again, OP

The result may be shocking

Math is hard
30 goes into 90 3 times

Edit.
Always expect the internet to be an asshole at the drop of a hat with only the slightest provocation. Go figure.
 
Last edited:

Pagusas

Elden Member
Because 120 is divisible by 24/30/40/60, it's a great number for supporting so many frequencies without needing a pulldown. 90 is not.
 

CuNi

Member
and 24? and 60? 90 into 120?

I don't think you're understanding this.

24 doesn't go into 30 or 60, so what exactly is your point here?
120 is the smallest number you can divide by 24, 30 and 60, which is why a "math" explanation is destined to be wrong to begin with.
 

baphomet

Member
24 doesn't go into 30 or 60, so what exactly is your point here?
120 is the smallest number you can divide by 24, 30 and 60, which is why a "math" explanation is destined to be wrong to begin with.

We're talking about 120hz displays.
 
Last edited:

CuNi

Member
You need to divide the Hz of the display (120, 90, 60) by the fps, and the result needs to be an integer. 120 divided by 90 is 1.333..., so the frames can't be displayed at a smooth pace.
We're talking about 120hz.

The question was why, before VRR (since that makes this all obsolete), TVs and screens made the jump from 60 to 120 and not, for example, 90, to which there is no definitive answer.
At some point it was simply decided; we will never know if the answer is "because you can extrapolate more easily", "it's easier to implement/build", or perhaps even "it gave the best increase in perceived smoothness compared to 75/90Hz".

We'll most likely never know, and I don't think there is a definitive answer to this.

Remember, we even had/have 48fps movies (HFR).
 

baphomet

Member
The question was why, before VRR (since that makes this all obsolete), TVs and screens made the jump from 60 to 120 and not, for example, 90, to which there is no definitive answer.
At some point it was simply decided; we will never know if the answer is "because you can extrapolate more easily", "it's easier to implement/build", or perhaps even "it gave the best increase in perceived smoothness compared to 75/90Hz".

We'll most likely never know, and I don't think there is a definitive answer to this.

Remember, we even had/have 48fps movies (HFR).

Because 60Hz content can't be displayed correctly on a 90Hz display.

That's always been the answer.

It's not some age-old question.
 

Knightime_X

Member
90fps does not divide evenly into a 60Hz or 120/240Hz refresh rate.
There would be overlaps that cause noticeable frame stutter, which is most definitely NOT pleasing to the eye.

24fps content gets repeated frames (3:2 pulldown) and, on some TVs, blurry interpolated frames to fill in the gaps, which wouldn't work for games.
That's why 24fps looks OK on a 60Hz monitor.
 
Last edited:

01011001

Banned
24 doesn't go into 30 or 60, so what exactly is your point here?
120 is the smallest number you can divide by 24, 30 and 60, which is why a "math" explanation is destined to be wrong to begin with.

24fps is an unfortunate remnant of a time long past. movies should have switched to 30fps a long time ago.

but many modern TVs actually support 24hz playback, so TV tech solved the issue to a point.
 
24fps is an unfortunate remnant of a time long past. movies should have switched to 30fps a long time ago.

but many modern TVs actually support 24hz playback, so TV tech solved the issue to a point.
the only good thing about the Hobbit movies was that they tried to kick-start 48fps. shame it didn't work out.
 
Last edited:

Hugare

Member
30 goes into 90 3 times

Edit.
Always expect the internet to be an asshole at the drop of a hat with only the slightest provocation. Go figure.
Take a chill pill, my man.

It was a light-hearted joke. That's why I also gave you a serious answer in my post.

30 goes into 90. But on a 90Hz display, you would only have 30, 45 and 90Hz options.

With 120Hz you have 24, 30, 40, 60 and 120Hz, and these options are important because movies/TV shows use 24fps and most console games use 30/60fps.

"Why did the industry choose 24 fps for movies and 30/60 for games and not 40 fps?"

My guess is that they had to choose something to be the standard, and these refresh rates are good enough to be acceptable to the majority of people. More fps = more cost in terms of hardware, not only for gaming but also for making movies and so on.

The question was why, before VRR (since that makes this all obsolete), TVs and screens made the jump from 60 to 120 and not, for example, 90, to which there is no definitive answer.
At some point it was simply decided; we will never know if the answer is "because you can extrapolate more easily", "it's easier to implement/build", or perhaps even "it gave the best increase in perceived smoothness compared to 75/90Hz".

We'll most likely never know, and I don't think there is a definitive answer to this.

Remember, we even had/have 48fps movies (HFR).

See my answer above
 

01011001

Banned
the only good thing about the Hobbit movies was that they tried to kick-start 48fps. shame it didn't work out.

48 is also shit. it's even worse actually if your TV doesn't support it.

30 or 60, anything else should be discarded for video production of any kind imo.

the fact that 50hz cameras still exist gives me headaches
 
Last edited:

CuNi

Member
48 is also shit. it's even worse actually if your TV doesn't support it.

30 or 60, anything else should be discarded for video production of any kind imo.

the fact that 50hz cameras still exist gives me headaches

I agree.
I also don't understand why cinema sticks to 24hz and/or why media/games aim for 30fps.
Furthermore, I get that this has historical reasons from back in the day when bandwidth was very limited and cutbacks were necessary, but nowadays it's literally just "because it's always been like that".
 

nkarafo

Member
Most TVs/monitors are fixed at 60Hz. That means they can only handle frame rates that divide evenly into that: 20fps, 30fps or 60fps. If you try, say, 40fps on a 60Hz fixed panel, you are going to get stutters.

Basically, when a game runs at 30fps, it displays the same frame twice. At 20fps, it displays the same frame three times. At 60fps you get 60 unique frames. But at 40fps some frames will be skipped or repeated more or fewer times than others because you can't have perfect sync, and that inconsistency causes stutters.

A 120hz TV can do 40fps though (each frame x 3) so you can add that option in the above list, along with 120fps if you can handle that. My 240hz monitor can also do 80fps.

Keep in mind that in Europe (PAL) the standards were different. There you had slightly higher resolution 50Hz TVs, so if a game couldn't run at 50fps it would have to run at 25fps. 20fps NTSC/US games would have to run at odd rates like 17fps, which isn't perfectly in sync, but at such a low fps, one skipped frame would go unnoticed. Also, I assume most TVs should also support a "hidden" 24Hz mode, so 24fps movies can be displayed smoothly. But I'm not really sure about that. Maybe we've been getting some skipped frames in movies all this time and we don't notice because the frame rate is low anyway.

All these numbers are fixed frame rates for fixed refresh rates. However, if your TV/monitor supports VRR, you don't have to follow these rules because VRR panels can sync their refresh rates to whatever frame rate the content runs.
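
To make the skipping/repeating concrete, here's a rough sketch that counts how many refreshes each rendered frame stays on screen on a fixed-refresh panel (the fps and refresh values are just examples):

```python
# How many refresh intervals each rendered frame is held for on a fixed-refresh
# display with v-sync. Uses integer ceiling division to avoid float rounding.
def hold_pattern(fps, refresh_hz=120, frames=9):
    holds = []
    for i in range(frames):
        start = -(-i * refresh_hz // fps)       # refresh tick where frame i appears
        end = -(-(i + 1) * refresh_hz // fps)   # tick where frame i+1 takes over
        holds.append(end - start)
    return holds

print(hold_pattern(60))                 # [2, 2, 2, 2, 2, 2, 2, 2, 2] -> even pacing
print(hold_pattern(40))                 # [3, 3, 3, 3, 3, 3, 3, 3, 3] -> even, hence 40fps modes
print(hold_pattern(90))                 # [2, 1, 1, 2, 1, 1, 2, 1, 1] -> uneven holds = stutter
print(hold_pattern(40, refresh_hz=60))  # [2, 1, 2, 1, 2, 1, 2, 1, 2] -> why 40fps stutters at 60Hz
```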
 
Last edited:

Soodanim

Gold Member
It's time for my PAL brothers to wake up and represent shorter intervals.

25/50 is objectively worse for smoothness than 30/60, but if the world had stuck with PAL's speeds, I wonder if we would have seen more games reach 50fps in situations where games ended up releasing at 30 under the 30/60 choice.

Outside of 25 being objectively too slow (and that's a big point, especially for modern action), I think it would have been just fine, if not better, in modern times when you consider what's happening. You have 40fps modes being introduced, where people here say it's so much better than 30 that they're happy to stick with it over 60 because of the happy medium. Then when we talk about multiples for higher framerates, it gets better: the multiples are easier to hit, so once you introduce VRR and high refresh rates, those new targets (maybe even a 75Hz mode for those lucky enough) are much easier to reach. I don't know about any of you, but for me 75 is a big improvement over 60, yet past 90 I find it much harder to tell the difference. I would find it difficult to guess whether a framerate was 100 or 120.

But that's just fantasy booking, so it's irrelevant. PAL image quality > NTSC though.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
It's just about standards.
(There are governing bodies who come up with these standards, like the Society of Motion Picture and Television Engineers, SMPTE; I'm sure you've seen that initialism somewhere.)

120Hz is a good number to choose cuz a lot of the prior standards divide evenly into it:
24, 30 and 60.

You'll notice some PlayStation games have a 40fps mode; again, that's just because 120 became a standard and 40 divides into 120 evenly.

If TV manufacturers made a 90Hz TV:
30 and 45 would divide evenly, but 60 and 24 would be out.
Movie studios really want to have 24 support, so 120 was all but a given; 48 and 72 are "wanted" too, but whether they catch on is anyone's guess.


In reality VRR panels are gonna be the norm, and as gamers we won't care too much about what the standard is anymore cuz our games will run at whatever the devs decide.
I'm actually guessing future media players will have VRR support as well, so if a movie is shot at 72 it will play at 72, and if the studio wants 48, 48 it is.

In the PC space 120Hz was never really a thing ('cept for 3D monitors, I guess); we kinda skipped straight to 144Hz as the "standard".
Hell, I think 75Hz monitors were more prevalent than native 120Hz monitors.
 

UltimaKilo

Gold Member
Give it a few years (maybe 7-8) as hardware improves and VRR becomes more widespread, along with DLSS-type advancements.
 

Panajev2001a

GAF's Pleasant Genius
"Why did the industry chose 24 fps for movies and 30/60 for games and not 40 fps?"
… and thankfully, with modern displays those that want quality and a bit better framerate also have 40 FPS as a baseline :). See what Insomniac and others are doing.
 
Without VRR, 90fps would not be a good experience on a 120hz TV.

The framerate needs to neatly fit into the refresh rate, so for a smooth gameplay experience on a 120hz TV your options are: 30fps, 40fps, 60fps, and 120fps.

Now, with VRR, this all changes. In fact, I'd love to see more console games ditch v-sync entirely and just let us play at a fully unlocked framerate with a cap of 118 (to keep VRR enabled at all times; you sometimes get judder when VRR engages and disengages at or near the max refresh rate of the screen). You would get smoother gameplay and a faster response time thanks to the removal of the latency v-sync induces.

It makes me crazy that we haven't seen this standardized in console gaming yet.
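
As a rough sketch of that capping idea (the 2fps margin just mirrors the 118-on-a-120Hz-screen suggestion above; the exact margin is a judgment call, not a fixed rule):

```python
# Pick a frame-rate cap a little below the panel's max refresh so frame times
# always stay inside the VRR window and v-sync never kicks in.
def vrr_safe_cap(max_refresh_hz, margin_fps=2):
    return max_refresh_hz - margin_fps

for panel_hz in (120, 144, 240):
    cap = vrr_safe_cap(panel_hz)
    print(f"{panel_hz} Hz panel -> cap at {cap} fps "
          f"({1000 / cap:.2f} ms per frame vs {1000 / panel_hz:.2f} ms at max refresh)")
```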
 
Last edited: