
Frame Latency - Why you should stop talking about FPS

SapientWolf

Trucker Sexologist
Really glad this thing has reared its ugly head; now every time I mention the word "SLI", 15 seconds later there's an "OMG LATENCY" comment. Really is beautiful.

Especially because I can't feel the difference whatsoever. You could sit me down in front of a new-generation single-GPU PC and an SLI PC and promise me eternal wealth, and I would never be able to tell.
In my experience, going under 60fps/60hz with an SLI/Crossfire setup is remarkably bad, but it's fine if you're over that threshold. I think the inconsistency in frametimes is less perceptible at a high framerate.
 

ElFly

Member
It'd be more intuitive to people if we talked about standard deviation as measured in FPS (aka, 1/latency)

People are not used to thinking about the amount of time each frame takes to be drawn, and numbers as small as a few milliseconds won't really impress people.

Instead, if you say that the framerate is 60fps but it stutters down to 30fps, people won't be able to claim the framerate is solid.
 
It'd be more intuitive to people if we talked about standard deviation as measured in FPS (aka, 1/latency)

People are not used to thinking about the amount of time each frame takes to be drawn, and numbers as small as a few milliseconds won't really impress people.

Instead, if you say that the framerate is 60fps but it stutters down to 30fps, people won't be able to claim the framerate is solid.

I was just trying to articulate something similar. Is it possible / does it make sense to average the frame times and translate them back into a "true (equivalent) average fps"?
 

mkenyon

Banned
But saying "60 frames per second" and "16.7ms framerate" is the same thing.

The issue is that looking at FPS obfuscates where things are going wrong. There's been a long-standing issue in PC hardware where AMD processors always put out decent FPS numbers. But from my personal experience with the transition from a Phenom II X4 to Sandy Bridge, I knew there was something more to it than that.

Additionally, overclocking a CPU generally isn't reported as giving any significant gain in FPS in benchmarks when you're talking about high settings at 1080p, yet the gain is very easily seen when you look at frame latency data.

It doesn't matter if the data is harder to understand at first, because FPS is just too inaccurate. I don't really think it's very difficult at all, either. A single number representing the 99th percentile frame time is a great replacement for average FPS, for example, and the number of frames above a given threshold is a great replacement for minimum FPS. The difficult part is the transition, since people have been stuck in the FPS way of thinking for far too long.
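To make those two numbers concrete, here's a rough sketch in Python (the frame times are invented, and the 16.7ms cutoff is just one possible budget, matching a 60Hz display):

    frame_times_ms = [16.2, 16.9, 15.8, 17.1, 42.5, 16.4, 16.6, 33.9, 16.3, 16.8]

    def percentile(values, pct):
        # Nearest-rank percentile of the frame times (crude, but fine for a sketch).
        ordered = sorted(values)
        rank = max(0, int(round(pct / 100.0 * len(ordered))) - 1)
        return ordered[rank]

    # Replacement for "average FPS": the frame time that nearly all frames beat.
    p99 = percentile(frame_times_ms, 99)
    # Replacement for "minimum FPS": how many frames blew the 16.7ms budget.
    slow_frames = sum(1 for t in frame_times_ms if t > 16.7)

    print(f"99th percentile frame time: {p99:.1f} ms")
    print(f"Frames over 16.7 ms: {slow_frames}")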
 

methane47

Member
I was just trying to articulate something similar. Is it possible / does it make sense to average the frame times and translate them back into a "true (equivalent) average fps"?

An average will always leave room for large discrepancies. And the smaller you go in sample size, the harder it becomes for a human to process the results in real time.

I think a Std dev next to the FPS counter would be fine.

so 60/0 would be Perfect world/Heaven

While 60/2 would indicate MicroStutter

Something like that.
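A sketch of what that readout might look like in Python (the frame-time lists are made up; a real overlay would sample continuously):

    import statistics

    smooth = [16.7] * 60              # perfect world: every frame takes 16.7ms
    stuttery = [14.0, 19.4] * 30      # same average, but the frame times see-saw

    def readout(frame_times_ms):
        fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)
        dev = statistics.pstdev(frame_times_ms)
        return f"{fps:.0f}/{dev:.1f}"

    print(readout(smooth))     # -> 60/0.0  (heaven)
    print(readout(stuttery))   # -> 60/2.7  (microstutter)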
 

deviljho

Member
But saying "60 frames per second" and "16.7ms framerate" is the same thing.

I think a Std dev next to the FPS counter would be fine.

The best convention, IMO, would be to simply have 2 numbers like methane47 suggests. But not FPS/Stdv. Just show the mean and standard deviation, and through use everyone who cares will know what the good and bad numbers look like. The units are the same, and you'd just aim to keep them both low, as opposed to a high FPS / low std dev. Plus, it is tough to tell what a good std dev is if you don't have the mean right next to it.

So looking at 16.7ms / 2ms as an example - and you'd want to keep both numbers low.

I was just trying to articulate something similar. Is it possible / does it make sense to average the frame times and translate them back into a "true (equivalent) average fps"?

Not really. Consider a two-frame scenario A with render times of 10ms and 50ms. The average is 30ms.

Now consider scenario B with render times of 28ms and 32ms...
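Spelling that comparison out (a quick sketch; the "equivalent fps" here is just 1000 divided by the average frame time):

    scenario_a = [10, 50]   # ms per frame
    scenario_b = [28, 32]   # ms per frame

    for name, times in [("A", scenario_a), ("B", scenario_b)]:
        avg_ms = sum(times) / len(times)
        print(f"Scenario {name}: avg {avg_ms:.0f} ms "
              f"({1000 / avg_ms:.1f} fps equivalent), "
              f"spread {max(times) - min(times)} ms")

Both report the same 30ms average (33.3 fps equivalent), but A lurches between 10ms and 50ms frames while B barely wavers - the average alone can't tell them apart.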
 

mkenyon

Banned
The best convention, IMO, would be to simply have 2 numbers like methane47 suggests. But not FPS/Stdv. Just show the mean and standard deviation, and through use everyone who cares will know what the good and bad numbers look like. The units are the same, and you'd just aim to keep them both low, as opposed to a high FPS / low std dev. Plus, it is tough to tell what a good std dev is if you don't have the mean right next to it.

So looking at 16.7ms / 2ms as an example - and you'd want to keep both numbers low.

Not really. Consider a two-frame scenario A with render times of 10ms and 50ms. The average is 30ms.

Now consider scenario B with render times of 28ms and 32ms...
Interesting ideas. I'll put my data into this format later tonight and show what it looks like.

And yeah, averaging gets back to the main issue with polling in FPS. I still think the 99th percentile (or whatever most accurately describes it) is the best replacement for average FPS.
 

Alexios

Cores, shaders and BIOS oh my!
I think that if there's stutter that makes the game not feel smooth they aren't going to praise it for its fps and only mention the stuttering, unresponsive controls, or whatever else it causes. When a game is praised for its fps it's probably stable enough to make it great, otherwise you'd be hearing about the complaints first. Well, that's what I think...
 

mkenyon

Banned
I think that if there's stutter that makes the game not feel smooth they aren't going to praise it for its fps and only mention the stuttering, unresponsive controls, or whatever else it causes. When a game is praised for its fps it's probably stable enough to make it great, otherwise you'd be hearing about the complaints first. Well, that's what I think...
The subjective portion always has been and always will be an integral part of the review. But we can have our cake and eat it too. This is a way to objectively demonstrate a full analysis of performance.

It also helps keep people honest, because they can't gloss over details. Or they can go the Tom's route and just change the metrics to conjure up some spicy results that get clicks.
 

Alexios

Cores, shaders and BIOS oh my!
Ah... I was talking about game reviews... I didn't know hardware could cause this... How can you separate stutter caused by the hardware from stutter caused by the software, though (when the latter can show up on specific hardware only, too)...
 

mkenyon

Banned
If the same issue persists across multiple systems despite different hardware, it's pretty easy to see that it is software related. Haz and I both did some testing in Firefall recently and found some huge spikes in frame latency that were happening exactly 10.2 seconds apart.

[Image: fQZOR.png]

[Image: 9b0QD.png]

We tested it with different GPUs (a 670 and a 7970), albeit similar processors (a 2600K for him, a 3570K for me). The devs found the data interesting, and the issue was determined to be software related.
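For anyone who wants to look for that kind of pattern in their own logs, something along these lines works. This is only a sketch: the "frametimes.csv" filename, the one-frame-time-per-line format, and the 50ms spike cutoff are assumptions for illustration, not the exact tools we used.

    SPIKE_MS = 50.0   # assumed cutoff for what counts as a spike

    # Assumes a log with one frame time in milliseconds per line.
    with open("frametimes.csv") as f:
        frame_times = [float(line) for line in f if line.strip()]

    elapsed = 0.0
    spike_times = []
    for ft in frame_times:
        elapsed += ft
        if ft > SPIKE_MS:
            spike_times.append(elapsed / 1000.0)   # seconds into the run

    intervals = [b - a for a, b in zip(spike_times, spike_times[1:])]
    print("Seconds between spikes:", [round(i, 1) for i in intervals])

A regular spacing between spikes (like the ~10.2 seconds here) points at something periodic in the software rather than at the GPU.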
 

Durante

Member
Which is funny, because I'm an excellent player when it comes to FPS games and most people wouldn't notice SLI frame latency whatsoever. However, now it's suddenly a 'thing' and there are graphs and write-ups everywhere; now everybody suddenly feels a 3-day delay and is a professor in frame latency.
I don't know what you were reading, but people have noticed (and been complaining about) SLI microstuttering ever since AFR was invented. The only difference is that we now have some good metrics to quantify it.

If anything, this should help multi-GPU setups become better in practice; recent NV drivers already show more stable frame times, at least with 2 GPUs. When people only look at FPS, only FPS will be optimized.
 

mkenyon

Banned
Cross post -

Listening to the Tech Report podcast right now: Scott Wasson says that AMD has confirmed there will be a patch relatively soon with a new memory management piece for all GCN-based chips, which will mean a general performance increase across all DX10/11 titles.

So awesome.

Looks like this new movement is really helping push the industry in a very good direction. Lovin' it.
 

Grayman

Member
Good read.

I was aware because Warsow would dump this data to console after a benchmark. I like it as a tool for looking at longer frametimes / lower framerates.
 

dark10x

Digital Foundry pixel pusher
Which is funny, because I'm an excellent player when it comes to FPS games and most people wouldn't notice SLI frame latency whatsoever. However, now it's suddenly a 'thing' and there are graphs and write-ups everywhere; now everybody suddenly feels a 3-day delay and is a professor in frame latency.
It's something I've noticed for years upon years but have been unable to explain what I was seeing. Suggesting that some of us would never have noticed this without these articles is bullshit, plain and simple.

I'm glad that this issue has been exposed, simply because it might actually help solve it. When so few others were noticing it I felt helpless, as in some cases it was nigh impossible to solve on my own (since the problem was driver or software related).
 

Ledsen

Member
So this is stacked on top of the TV's response time too, huh? The best HDTVs have around 4ms, right?

No, they are not the same thing at all. Frame latency is milliseconds per frame, and it is related to fps, which is frames per second. The main advantages:

1) a low frame latency (or a high fps) gives perceived smoothness in motion.

2) a consistent frame latency (roughly the same for each frame) gives perceived consistency in motion (a lack of "stutter" and "jerkiness").

This second point cannot be accurately measured with fps, only with frame latency.

TV latency applies equally to all frames and therefore does not affect your perception of the smoothness or consistency of the image. The main advantage of a low TV latency (response time) is a lack of input lag.
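For reference, the conversion between the two units is trivial (a throwaway sketch):

    def fps_to_frame_time_ms(fps):
        return 1000.0 / fps   # the inverse, 1000 / ms, goes the other way

    for fps in (120, 60, 30):
        print(f"{fps} fps = {fps_to_frame_time_ms(fps):.1f} ms per frame")
    # 120 fps = 8.3ms, 60 fps = 16.7ms, 30 fps = 33.3ms. What the single fps
    # number can't express is how much each individual frame deviates from it.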
 

foogles

Member
I think one of the important things about this is that it's not something that can't be seen and can only be measured in graphs. Get the wrong kind of GPU setup with the wrong kind of game, and you can easily see it. I saw it in Fallout 3 with my 4850 Crossfire setup (but didn't know what it was back then), I saw it in Skyrim and New Vegas with my GTX 460 SLI setup, and according to that gameplay video that was slightly slowed down, you can pretty easily see and feel it in games like Skyrim with the 7950. Now, sure, all of these examples are most obvious inside the rather geriatric GameBryo/Creation engine, but hey, not every game is made on the most efficient and compatible engines out there. Even if you don't like Bethesda games, something's going to come along that only delivers smooth frame times on the cards and drivers specifically tuned to reduce this - if AMD and Nvidia aren't vigilant, this could become a real problem again.

I'm sure plenty of people will stick their fingers in their ears and say BETTER FPS LALA I CAN'T HEAR YOU, but luckily at least both AMD and Nvidia are now keeping this in mind. GPU developers make their products to sell, and in the past they have most definitely optimized their cards for best frame rates since that's been the biggest metric for measuring performance, but now there's a new metric to make sure that they're consistent. You don't want consistency? That's cool, don't demand it.
 
Another thing about latency:

With the Wii U, many people noticed for the first time how laggy most modern TVs are. At first, some people thought it was a fault in the Wii U itself.

The Wii U sends the video/audio stream to the GamePad at 60 FPS with practically no latency. You can easily notice the TV's latency when both the GamePad and the TV play the same sound. Of course this latency exists on every console or PC hooked up to a TV, but with the GamePad it's the first time many people have noticed it.

All LG TVs have significant latency because you cannot fully turn off the post-processing. And even some TVs that do let you reduce latency in the settings still have a noticeable delay.
 

Ledsen

Member
I think one of the important things about this is that it's not something that can't be seen and can only be measured in graphs. Get the wrong kind of GPU setup with the wrong kind of game, and you can easily see it. I saw it in Fallout 3 with my 4850 Crossfire setup (but didn't know what it was back then), I saw it in Skyrim and New Vegas with my GTX 460 SLI setup, and according to that gameplay video that was slightly slowed down, you can pretty easily see and feel it in games like Skyrim with the 7950. Now, sure, all of these examples are most obvious inside the rather geriatric GameBryo/Creation engine, but hey, not every game is made on the most efficient and compatible engines out there. Even if you don't like Bethesda games, something's going to come along that only delivers smooth frame times on the cards and drivers specifically tuned to reduce this - if AMD and Nvidia aren't vigilant, this could become a real problem again.

I'm sure plenty of people will stick their fingers in their ears and say BETTER FPS LALA I CAN'T HEAR YOU, but luckily at least both AMD and Nvidia are now keeping this in mind. GPU developers make their products to sell, and in the past they have most definitely optimized their cards for best frame rates since that's been the biggest metric for measuring performance, but now there's a new metric to make sure that they're consistent. You don't want consistency? That's cool, don't demand it.

Bethesda games have a long-standing Gamebryo bug that makes them run at 64Hz instead of 60Hz, and the update-frequency mismatch between game and monitor causes the massive microstutter in those games. I'm pretty sure it can happen on any setup, and maybe it even happens to everyone (although not everyone will notice it). Each game has had mods such as "Stutter Remover" to fix the bug.
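A rough sketch of why a 64Hz simulation on a 60Hz display stutters (purely illustrative; this is not how the engine actually schedules anything):

    GAME_HZ, DISPLAY_HZ = 64.0, 60.0
    SECONDS = 10.0

    refreshes = int(DISPLAY_HZ * SECONDS)
    shown = []
    for r in range(refreshes):
        t = r / DISPLAY_HZ
        shown.append(int(t * GAME_HZ))   # latest game update available at this refresh

    # A jump of 2 between consecutive refreshes means one game update was never shown.
    skipped = sum(1 for a, b in zip(shown, shown[1:]) if b - a > 1)
    print(f"Game updates dropped per second: {skipped / SECONDS:.1f}")
    # Comes out around 4 per second (the 64 - 60 Hz difference): a small visible
    # hitch roughly four times every second.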
 

dark10x

Digital Foundry pixel pusher
Another thing about latency:

With the Wii U, many people noticed for the first time how laggy most modern TVs are. At first, some people thought it was a fault in the Wii U itself.

The Wii U sends the video/audio stream to the GamePad at 60 FPS with practically no latency. You can easily notice the TV's latency when both the GamePad and the TV play the same sound. Of course this latency exists on every console or PC hooked up to a TV, but with the GamePad it's the first time many people have noticed it.

All LG TVs have significant latency because you cannot fully turn off the post-processing. And even some TVs that do let you reduce latency in the settings still have a noticeable delay.
My setup is fast enough that the two produce images and sound at roughly the same time (no noticeable difference), but I have noticed this issue when using it in conjunction with some other setups out there. In a few cases, one could detect a full second of difference between the Wii U GamePad and the TV.
 
This gives fighting game fans another thing to bitch about on top of TV frame lag.

Something with consistent latency measured in single digit ms is impossible for me to notice. I notice inconsistency or stutter really easily, but I guess that's just how eyes work.

Edit: Essentially beaten horribly on all points
 
What I don't understand is why we need to stop talking about the FPS and start talking about delta times. It's established in the OP that the two are different ways of describing the same thing, so what benefit is there to switching?
 

Ledsen

Member
What I don't understand is why we need to stop talking about the FPS and start talking about delta times. It's established in the OP that the two are different ways of describing the same thing, so what benefit is there to switching?

I'll quote myself:

No, they are not the same thing at all. Frame latency is milliseconds per frame, and it is related to fps, which is frames per second. The main advantages:

1) a low frame latency (or a high fps) gives perceived smoothness in motion.

2) a consistent frame latency (roughly the same for each frame) gives perceived consistency in motion (a lack of "stutter" and "jerkiness").

This second point cannot be accurately measured with fps, only with frame latency.

TV latency applies equally to all frames and therefore does not affect your perception of the smoothness or consistency of the image. The main advantage of a low TV latency (response time) is a lack of input lag.
 
I'll quote myself:

But the two measurements are directly convertible! You can derive the delta time between frames given the framerate of a particular moment and vice versa.

There seems to be this notion that a framerate measured in FPS can only be an average of values over a second, which is just wrong. It's like saying you can only describe the speed of a vehicle in MPH after it's been travelling for an hour.
 

dionysus

Yaldog
No, they are not the same thing at all. Frame latency is milliseconds per frame, and it is related to fps, which is frames per second. The main advantages:

1) a low frame latency (or a high fps) gives perceived smoothness in motion.

2) a consistent frame latency (roughly the same for each frame) gives perceived consistency in motion (a lack of "stutter" and "jerkiness").

This second point cannot be accurately measured with fps, only with frame latency.

TV latency applies equally to all frames and therefore does not affect your perception of the smoothness or consistency of the image. The main advantage of a low TV latency (response time) is a lack of input lag.

But if we talk about the TV's refresh rate, it is very much related to stutter and smoothness. For example, if you have a large deviation in frame latency from the computer, you could have entire frames skipped because they fall between monitor refreshes, and then a few milliseconds later the same frame displayed twice in a row if the latency stretches past the monitor's refresh interval.

That is why consistency is so important, and why your frame time should always be a whole multiple of your monitor's refresh interval (equivalently, your framerate a fraction like 1/2 or 1/4 of the refresh rate). If you had a consistent 45 fps (22.2ms), you would still get stutter on a 60Hz monitor. It would be better to be at 33.3ms (30 fps) on a 60Hz monitor.
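Here's a small sketch of that 45-vs-30 fps point, assuming vsync on a 60Hz display and a simplified model where a frame appears at the first refresh after it finishes rendering:

    import math

    REFRESH_MS = 1000.0 / 60.0

    def on_screen_durations(frame_time_ms, frames=8):
        ready = 0.0
        shown_at = []
        for _ in range(frames):
            ready += frame_time_ms
            # Displayed at the first refresh after the frame is done
            # (the epsilon guards against floating-point edge cases).
            shown_at.append(math.ceil(ready / REFRESH_MS - 1e-9) * REFRESH_MS)
        return [round(b - a, 1) for a, b in zip(shown_at, shown_at[1:])]

    print("45 fps on 60 Hz:", on_screen_durations(1000.0 / 45))  # uneven mix of 16.7 and 33.3
    print("30 fps on 60 Hz:", on_screen_durations(1000.0 / 30))  # steady 33.3 every time

Even though the 45 fps stream is perfectly consistent coming out of the GPU, what reaches your eyes is an uneven mix of frames held for one refresh and frames held for two.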
 

mkenyon

Banned
What I don't understand is why we need to stop talking about the FPS and start talking about delta times. It's established in the OP that the two are different ways of describing the same thing, so what benefit is there to switching?
From the OP

How is this different than frames per second?

Frames per second polls data once every second. The tools to record this information say, "in this one second of time X frames were rendered". In a way, it is an average of every frame latency value over the course of one second. As anyone familiar with statistics knows, this has the issue of essentially covering up information that would otherwise stand out as problematic.

For reference, 8.3ms = 120 fps, 16.7ms = 60 fps, 33.3ms = 30 fps.

Here is one second of gameplay in Dota 2, with the frame time of each frame displayed:

[Image: 3I7YHcp.png]


In a normal frames per second benchmark, this data would simply be listed as "74.6 frames per second".

We begin to see why this is problematic. During that second, some frames were rendered in over 16.7ms, while other frames were rendered in close to 10ms. These large swings, even over the course of one second, are what can lead to stuttery gameplay, or to moments where the game seems to just completely crap out.

Ultimately, there are too many frames in a second to really evaluate how consistent a frame rate really is and how smooth it feels using the standard FPS metric.

FPS data is polled every second, and as a result, it can hide data where things go awry.

This is the typical chart you would see when talking about performance; look at how close the numbers are for each of the MHz steppings:


Here's FPS over time, again with the data polled and averaged every second:


If you take a look at it frame by frame rather than averaging it every second, you get this:


The number of frames going over a certain value can double at times. That's a huge difference. Now let's look at what effect this has on 99th Percentile Frame Latency, which is more or less a more accurate version of "Average FPS":


That's a difference of nearly 20FPS between the highest frequency and the lowest frequency there.

Here's the thing. When game performance is going as it should, and there are no random issues, these numbers should generally line up with Average FPS really well. But, when things are going WRONG, FPS completely covers up the data by averaging out the bad stuff with the normal stuff over the course of a second.
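As a toy illustration of that last point (the numbers below are invented, not the actual Dota 2 capture from earlier):

    # Build roughly one second of frame times: mostly quick 11ms frames,
    # with a 45ms hitch every 20th frame.
    second = []
    elapsed = 0.0
    while elapsed < 1000.0:
        ft = 45.0 if len(second) % 20 == 19 else 11.0
        second.append(ft)
        elapsed += ft

    fps = len(second)   # roughly what a once-per-second poll would report
    print(f"Reported FPS for this second: {fps}")
    print(f"Worst frame: {max(second):.0f} ms ({1000 / max(second):.0f} fps equivalent)")
    print(f"Frames over the 16.7 ms budget: {sum(1 for t in second if t > 16.7)}")

"80 fps" sounds great on paper; the handful of 45ms hitches buried inside that second do not.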
 
Don't people already make comments along the lines of "The game runs at 60fps but it's not steady" though? How would this benefit casual discussions without a lot of data to draw from?
 

mkenyon

Banned
Because it's not just about inconsistency, it's about more accurate numbers. Avg FPS, for example, is an average of frame times over one second which is then averaged again across the entire benchmark. This creates ambiguous and inaccurate data.
 
Because it's not just about inconsistency, it's about more accurate numbers. Avg FPS, for example, is an average of frame times over one second which is then averaged again across the entire benchmark. This creates ambiguous and inaccurate data.

Now I'm following you. I finished reading the thread, and my question of how to really communicate this without resorting to detailed infographics in every OT has already been discussed.

Don't get me wrong, I'm just playing devil's advocate here. Thanks for taking the time to put this together.
 

ElFly

Member
The best convention, IMO, would be to simply have 2 numbers like methane47 suggests. But not FPS/Stdv. Just show the mean and standard deviation, and through use everyone who cares will know what the good and bad numbers look like. The units are the same, and you'd just aim to keep them both low, as opposed to a high FPS / low std dev. Plus, it is tough to tell what a good std dev is if you don't have the mean right next to it.

So looking at 16.7ms / 2ms as an example - and you'd want to keep both numbers low.

Mmmm I am starting to think I was wrong and that this is the best idea.

Damn you gaf and your rampant banning of people.
 

mkenyon

Banned
Mmmm I am starting to think I was wrong and that this is the best idea.

Damn you gaf and your rampant banning of people.
Yeah, super shitty. I'd like to continue that conversation with him :(

Had to get some wife time in last night, have some dog/horse business to attend to this evening.

What I could do is actually just give access to the raw data for people to play with, if they desire.
 
FPS data is polled every second, and as a result, it can hide data where things go awry.

But it's not! It can be, depending on what software you're using to measure it, but there's no reason why it has to be! Like I say, you don't have to poll the speed of a car for an hour before you can say how many miles an hour it's traveling at.

By all means we should be looking at what's going on at a per-frame basis, but I think throwing out the de facto standard for describing framerate obfuscates the real issue.
 

mkenyon

Banned
But it's not! It can be, depending on what software you're using to measure it, but there's no reason why it has to be! Like I say, you don't have to poll the speed of a car for an hour before you can say how many miles an hour it's traveling at.

By all means we should be looking at what's going on at a per-frame basis, but I think throwing out the de facto standard for describing framerate obfuscates the real issue.
Frames per second is inherently a number that shows the number of frames rendered in a second. It can't function any other way.

When you poll that MORE than once per second, you are talking about frame latency.

Just because it is the standard doesn't mean it should continue to be used. I don't get how moving towards a more accurate methodology obfuscates the real issue. Elaborate a bit?
 

methane47

Member
But it's not! It can be, depending on what software you're using to measure it, but there's no reason why it has to be! Like I say, you don't have to poll the speed of a car for an hour before you can say how many miles an hour it's traveling at.

By all means we should be looking at what's going on at a per-frame basis, but I think throwing out the de facto standard for describing framerate obfuscates the real issue.

You gave a great analogy here but failed to use it correctly. Indeed, you don't need to poll a car once an hour to see how fast it is going. But the issue is that gamers don't care as much about average speed as they care about how smooth the drive was.

A car can drive at 60 mph for an hour and cover 60 miles. Or a car can drive for 10 minutes at 120 mph, stop for 10 minutes, and keep repeating that pattern for the whole hour - and it also covers 60 miles.

When you check the average speed over the hour for the two drives, the mph will be the same, while one drive was MUCH smoother.
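In numbers (a quick sketch of those two drives):

    steady = [(60, 60)]                       # (minutes, mph): one hour at 60 mph
    stop_and_go = [(10, 120), (10, 0)] * 3    # alternate flat-out and parked for an hour

    for name, legs in [("steady", steady), ("stop-and-go", stop_and_go)]:
        minutes = sum(m for m, _ in legs)
        miles = sum(m / 60.0 * mph for m, mph in legs)
        print(f"{name}: {miles:.0f} miles in {minutes} min "
              f"-> average {miles / (minutes / 60.0):.0f} mph")

Both come out to 60 mph for the hour, which is exactly the point: the hourly average hides how jerky the second drive was.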
 

mkenyon

Banned
What do you think about this?

http://www.pcper.com/reviews/Graphi...ance-Review-and-Frame-Rating-Update/Frame-Rat

(CF = blank frames? Is tech report missing this? )
I'd like to see more once they have something more comprehensive.

There are definitely limits to multi-card testing, which is why TechReport has tried to steer clear of it for the most part. I do know that SLI does indeed smooth things over compared to what DirectX is reporting, so their findings do seem interesting.
 
Oh wow, sorry, totally forgot about this thread!

Frames per second is inherently a number that shows the number of frames rendered in a second. It can't function any other way.

When you poll that MORE than once per second, you are talking about frame latency.

Just because it is the standard doesn't mean it should continue to be used. I don't get how moving towards a more accurate methodology obfuscates the real issue. Elaborate a bit?

But you can get a frame-rate value from any two points of data, any distance apart! Who says you can only poll a frame-rate measured in FPS no more or less than once a second? I would post my car analogy again, but you seem to be ignoring it!

And it clouds the issue because there's no reason for it. Why does saying 'the 99th percentile frame time is 17.5ms' have any more meaning than saying 'the 99th percentile frame-rate is 57.1fps'? Based on frame-rate being a standard measure for so many years, I'd argue that the latter is more meaningful as to what it represents.
 

mkenyon

Banned
Oh wow, sorry, totally forgot about this thread!



But you can get a frame-rate value from any two points of data, any distance apart! Who says you can only poll a frame-rate measured in FPS no more or less than once a second? I would post my car analogy again, but you seem to be ignoring it!

And it clouds the issue because there's no reason for it. Why does saying 'the 99th percentile frame time is 17.5ms' have any more meaning than saying 'the 99th percentile frame-rate is 57.1fps'? Based on frame-rate being a standard measure for so many years, I'd argue that the latter is more meaningful as to what it represents.
I think I get what you are saying.

Pretty much, you mean, 'use the same numbers and methodology, but present it as Frames Per Second, rather than frame time'. If so, I think you might be on to something.
 
I think I get what you are saying.

Pretty much, you mean, 'use the same numbers and methodology, but present it as Frames Per Second, rather than frame time'. If so, I think you might be on to something.

Yeah, that was what I was trying to get across! Apologies for sounding exasperated, but I was frustrated that I couldn't accurately explain my point!
 

Cassius

Member
Did some testing on the Lost Planet 2 Benchmark (Part A). My system is old. HD 4850, 8 gigs of RAM and an Athlon II dual core.

Everything maxed at 1440x900

[Image: LrLBTP4.gif]


CPU bound test with all graphical options bottomed out.

[Image: PjUaS6F.gif]
 