
Frame Latency - Why you should stop talking about FPS

mkenyon

Banned
***03.27.2013: Updated with an amazing article from PC Perspective that anyone who wants to talk about game performance needs to read.***

I thought I would put this out there as something folks should know and learn about, since it is becoming more and more common for hardware websites to use frame latency as a metric when talking about game performance.

What is frame latency?

When a frame is rendered, it takes a certain amount of time to do so. That time, measured in milliseconds, is the frame's latency: how long that single frame took to render and display.

How is this different than frames per second?

Frames per second polls data once every second. The tools that record this information say, "in this one second of time, X frames were rendered." In effect, it is an average of every frame latency value over the course of one second. As anyone familiar with statistics knows, an average has the problem of covering up information that would otherwise stand out as problematic.

For reference, 8.3ms = 120 fps, 16.7ms = 60 fps, 33.3ms = 30 fps.
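
To make the relationship concrete, here is a quick sketch in Python. The frame times are invented for illustration; real numbers would come from a benchmark log. Note how one severe hitch disappears into a healthy-looking average:

```python
# Invented frame times (ms) for one short stretch of gameplay,
# including a single severe hitch at 42ms.
frame_times_ms = [10.4, 11.0, 10.8, 42.0, 10.9, 11.2]

# FPS is just 1000 divided by the frame time in milliseconds.
avg_ms = sum(frame_times_ms) / len(frame_times_ms)
worst_ms = max(frame_times_ms)

print(f"average: {avg_ms:.1f}ms = {1000 / avg_ms:.1f} fps")    # ~62 fps, looks fine
print(f"worst:   {worst_ms:.1f}ms = {1000 / worst_ms:.1f} fps")  # the 42ms stutter, hidden
```

An FPS counter would report the average and nothing else; the 42ms frame, which is the one you actually feel, never shows up.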

Here is one second of gameplay in Dota 2, with the frame time of each frame displayed:

In a normal frames per second benchmark, this data would simply be listed as "74.6 frames per second".

We begin to see why this is problematic. During that second, some frames took over 16.7ms to render, while others were rendered in close to 10ms. Those large swings, even over the course of a single second, are what can lead to stuttery gameplay, or to moments where the game seems to just completely crap out.

Ultimately, there are too many frames in a second to evaluate how consistent a frame rate really is, and how smooth it feels, using the standard FPS metric.

Why has frames per second been the standard metric of benchmarking for so long?

It's easy to understand from the layman's perspective. It's also incredibly easy to benchmark and to sort the data. In a one-minute benchmark you only have 60 data points, as compared to thousands with frame latency.

Most importantly, when something becomes the standard, it's very easy to get stuck in the same mindset without ever asking whether it is actually accurate.

How is this being presented in an easily understood format?

This metric is relatively new, and people are still working things out. TechReport currently has the most extensive and thorough data. Their key charts display:

99th Percentile - The time, in milliseconds, within which 99% of all frames were rendered. This serves as a more robust counterpart to 'Average FPS'.

Frame Latency by Percentile - This plots the full range of frame times against their percentile rank, giving a breakdown of the entire run. A picture is worth a thousand words; credit to TechReport for the graph below.

Frames rendered beyond X, time spent beyond X - This focuses entirely on the outliers by showing how many frames, or how much time, went beyond a given frame time. I've personally focused on 11.1ms, 16.7ms, and 33.3ms, as I see this as a good way to give accurate data and to see where things start to break down along the way.
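
For anyone who wants to compute these from their own logs, here is a hedged sketch in Python. The function name and the sample data are mine, not TechReport's; it simply implements the three metrics described above from a list of per-frame render times in milliseconds:

```python
import math

def latency_metrics(frame_times_ms, threshold_ms=16.7):
    """Summarize a list of per-frame render times (ms), TechReport-style."""
    times = sorted(frame_times_ms)
    # 99th percentile: 99% of frames finished in this time or less.
    p99 = times[math.ceil(0.99 * len(times)) - 1]
    # Outlier metrics: frames over the threshold, and the total time
    # spent past the threshold while working on those frames.
    slow = [t for t in times if t > threshold_ms]
    return {
        "99th percentile (ms)": p99,
        "frames beyond threshold": len(slow),
        "time beyond threshold (ms)": sum(t - threshold_ms for t in slow),
    }

# Invented data: a mostly smooth run with two visible hitches.
print(latency_metrics([12.1, 13.0, 12.4, 35.2, 12.8, 18.9, 12.5]))
```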


How does one find out frame latency?

An easy way to capture frame times is by checking the box in Fraps that puts a timestamp on every single frame rendered during a benchmark. There are a few tools out there where you can just load the resulting .csv file and get some neat line graphs; FRAFS Bench Viewer is fairly easy to use.

In order to look at the data in a number of different ways, you have to output it to a spreadsheet. Once it's in, you simply subtract each timestamp from the one that follows it; the difference between the two is the frame latency.
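
The same subtraction step can be scripted instead of done in a spreadsheet. This is a rough sketch only; the file name and the assumption that the timestamp sits in the second column are mine, so check your own Fraps output before relying on it:

```python
import csv

# Fraps' frametimes option logs a cumulative timestamp (ms) per frame.
# File name and column layout here are assumptions for this sketch.
with open("game frametimes.csv", newline="") as f:
    reader = csv.reader(f)
    next(reader)  # skip the header row
    timestamps = [float(row[1]) for row in reader]  # second column: time in ms

# Subtracting consecutive timestamps yields each frame's latency.
latencies = [b - a for a, b in zip(timestamps, timestamps[1:])]
print(f"{len(latencies)} frames, worst: {max(latencies):.1f}ms")
```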

Where can I find out more?

Inside the Second - The first article to talk about possible ways of working with frame time data, with an in-depth look at the method.

Gaming Performance with Today's CPUs - The first article to take the above methods and apply them to a comprehensive round-up.

7950 vs. 660Ti Revisited - This was essentially the "shots fired" article. AMD had a big driver update in December that seemed to indicate AMD was having its way with NVIDIA: the 7950 was reportedly outperforming the 670, and the 7970 was top dog. This article took that consensus and shook it up a bit.

As the Second Turns 1, As the Second Turns 2 - These are an overview, Q&A, and updates on the hubbub.

A driver update to reduce Radeon frame times - AMD admits that they weren't internally testing for frame latency, instead focusing on those big FPS numbers. They released a very timely hotfix to begin to deal with the issue. The speed of delivery, transparency, and effectiveness of the patch were pretty incredible.

PC Perspective investigates the matter further, using a frame overlay to examine the actual final video that the player sees. This might be the most important article regarding game performance. Incredibly in-depth.

So, is this something we should really care about?

Yes. It would be best to more or less stop talking in terms of frames per second. While it can be accurate at times, it can obscure things when they go badly, or really well. In order to help move beyond the era of poor testing methods, GAF would really be doing a service if we started talking about game performance in terms of frame latency/frame time instead.
 

Oppo

Member
There doesn't seem to be any consistent shorthand for comparing such latencies between games though. You need a graph to represent the data, do you not?
 

mkenyon

Banned
There doesn't seem to be any consistent shorthand for comparing such latencies between games though. You need a graph to represent the data, do you not?
I'm not sure I understand what you're asking. Can you elaborate?
 

mkenyon

Banned
How much have the latest AMD drivers reduced frame latency?
Pretty significantly. TechReport has done a few reviews since then.

I have a ton of data on a number of esports titles I'll be redoing with the new drivers as well, and I'll report back when I have it together.
Does downsampling add latency?
This is no different from FPS in what it represents; it's just more accurate, since it's the raw data rather than an average over the course of a second. So yes: downsampling makes each frame harder for your PC to render, so frames will take longer than they would have otherwise.
 

LCGeek

formerly sane
This is only starting to be brought up in general benchmarking. I'm all for it: milliseconds alongside FPS can clearly show whether cards are rendering as fast, and as consistently, as some companies claim. Going back to my 4850 and comparing it to my last two NVIDIA cards, this is one area where I noticed a lack of performance.

While I'm all for this benchmarking, most review sites still need to deal with the overhead and Aero/desktop issues that can induce stuttering. The High Precision Event Timer alone, and how it's basically left in a sorry state on a clean or fresh install, can give people a ton of problems.

Running my kernel timer at 0.5ms with a 13MHz performance counter, versus the default of 15ms at 2 or 3MHz, makes a huge difference in how games and real-time programs feel. If you want more info, check out the link here.
 

-KRS-

Member
How does a monitor affect this? Like say a TFT vs a CRT. Or is it purely the game we're talking here and frame latency is not affected by monitors?
 

SneakyStephan

Banned
As an AMD user, I think this should replace FPS in every benchmark and card review ever.
It's a way more meaningful metric.

So sick of the hopelessly uneven framerates in tons of games on my amd card.
 

TheExodu5

Banned
One question: is it possible for a frame to take longer than 16.7ms to render at 60fps with vsync? I assume it can't. If it did, it would miss the window for the screen refresh, which should imply that the game did not maintain 60fps.

Briefly compare two games using frame latency rather than FPS, so I can see how you'd express that difference.
The Dota 2 graph is the most visually representative, to me. The evenness of the line shows how smooth the experience should be. I'd maybe add a line at 16.7ms to highlight whether the game maintains 60fps as well.
 

Sowee

Banned
Is it a thing in technology to change the approach from 'the more, the better' to 'the smaller, the better'? I've seen so many sites out there measuring graphics capabilities this way, and in fact I like it more.
 

Perkel

Banned
Yeah, that should be standard. I often see games hitting 60FPS, and I easily notice those small hiccups that should not be there.
 

mkenyon

Banned
One question: is it possible for a frame to take longer than 16.7ms to render at 60fps with vsync? I assume it can't. If it did, it would miss the window for the screen refresh, which should imply that the game did not maintain 60fps.
I don't know, to be honest. I've never tested with vsync on. It's pretty easy to do yourself, the tools are listed in there. Give it a whirl!
 

kpjolee

Member
I didn't care much about latency, as I thought FPS would be the most important thing... but the recent AMD issues made me care about it more.
 

Felix Lighter

Member
AMD's recognition of the issue was surprisingly refreshing, considering how cutthroat the competition is and how easy it still is to point to FPS. It's a good sign for the future.
 

mr2xxx

Member
Interesting. I looked at a previously posted link and it made it seem that frame latency was only useful for looking at SLI performance, since it reveals micro-stuttering, but the 660Ti link shows it's useful for single GPUs as well.
 

brogan

Neo Member
Nope, this is too confusing for me. I will have to stick to FPS. However, I see your point that websites should include this in their benchmarks to give a more accurate result/conclusion.

Good work on the OP too.
 

deviljho

Member
There doesn't seem to be any consistent shorthand for comparing such latencies between games though. You need a graph to represent the data, do you not?
There is. It's called standard deviation.

Measuring the standard deviation of frame rendering times would tell you what you want to know.
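
Something like this minimal Python sketch would do it; the latency values are invented, and in practice they would be the per-frame times computed from a Fraps log as described in the OP:

```python
import statistics

# Invented per-frame latencies (ms); real ones would come from a benchmark.
latencies = [10.2, 15.8, 11.4, 33.5, 12.1, 16.9]

mean = statistics.mean(latencies)     # how fast, on average
spread = statistics.stdev(latencies)  # how consistent

print(f"{mean:.1f}±{spread:.1f}ms")   # prints "16.7±8.7ms"
```

A small standard deviation relative to the mean means an even, smooth-feeling frame delivery.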
 

Visualante2

Member
Yeah, it's too confusing; maybe you can get Khan Academy to simplify the explanation or something :/ that usually helps me understand math problems.

I already feel out of my depth in most PC performance threads. I have no understanding of the different anti-aliasing techniques, and most PC game settings are way over my head, especially the way they stack. I actually see an argument for simpler graphics, leaving all the high-end stuff for .ini files.
 

mkenyon

Banned
I didn't care much about latency, as I thought FPS would be the most important thing... but the recent AMD issues made me care about it more.
It's the same thing, but more accurate. This is the central point that people need to understand.
 

Felix Lighter

Member
It's just time per frame instead of frames per time. Also, instead of looking at the average, you're analyzing the consistency.

There are too many frames in a second to really evaluate how consistent a framerate is and how smooth it feels.
 

Deleted member 125677

Unconfirmed Member
So... "16.7ms FL now!" will be the new war cry of all the "60 FPS or bust" guys?
 

brogan

Neo Member
Mar 5, 2012
136
0
0
England
It's just time per frame instead of frames per time. Also, instead of looking at the average, you're analyzing the consistency.

There are too many frames in a second to really evaluate how consistent a framerate is and how smooth it feels.
And suddenly I understand!
 

Visualante2

Member
It's just time per frame instead of frames per time. Also, instead of looking at the average, you're analyzing the consistency.

There are too many frames in a second to really evaluate how consistent a framerate is and how smooth it feels.
I get that from a second reading, but the visualisations are still too confusing. And I'm not sure I want to be aware of the issue. Before I was aware of screen tearing I didn't notice it or care... now it bothers me. Will this just be another thing to ruin my enjoyment of games? I don't have the time or patience to sit tweaking my game for hours to get slightly better performance. I appreciate some do, but from an Average Joe perspective, frame rate is probably the most you can expect someone to care about. Perhaps it will help better inform purchasing decisions, but again, the graphs have to be simpler for that to happen.
How does a monitor affect this? Like say a TFT vs a CRT. Or is it purely the game we're talking here and frame latency is not affected by monitors?
The monitor would add time onto the display rate, right? So in theory you could have graphics hardware pumping out frames faster than your monitor can display them. Guessing here.
 

Cymbal Head

Banned
So essentially this method allows you to look at not only how quickly the hardware is pumping out frames (which FPS does too), but also how consistently it does so?

Seems useful.

Edit: There needs to be an easily understandable shorthand for this if it's going to catch on. Something like: 16.7±5ms FL

Does that capture the necessary information?
 

Addnan

Member
Interesting write-up, didn't even know such a thing was a thing(?) So this is how long it takes for one frame to render?

Is this the graph? A 5-minute match of BF3, all settings maxed, 64 players.
 

stryke

Member
I don't know what to believe in anymore.

Joking aside, that was really informative. Thanks for dropping some knowledge.
 

deviljho

Member
yeah, wouldn't you just want as small a distribution as possible (0 would be perfect buttery smoothness) so you don't get stuttering?
You're right in that you'd want 0 dispersion between the frame rendering times. But we're using the term "distribution" here to refer to certain models of how random events occur.
 

jonezer4

Member
yeah, wouldn't you just want as small a distribution as possible (0 would be perfect buttery smoothness) so you don't get stuttering?
I misspoke. You wouldn't need a normal distribution, but you would need to know the mean. Standard deviation alone wouldn't be enough: a game with a stdev of 0 sounds great... but not if it's running at a steady 100ms/frame.
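
A tiny sketch of that point, using two invented traces with identical (zero) spread but very different means:

```python
import statistics

steady_fast = [16.7] * 60   # rock-steady 60fps: stdev 0, mean 16.7ms
steady_slow = [100.0] * 60  # equally steady slideshow: stdev 0, mean 100ms

for trace in (steady_fast, steady_slow):
    print(f"mean {statistics.mean(trace):.1f}ms, "
          f"stdev {statistics.pstdev(trace):.1f}ms")
```

Both traces are perfectly consistent; only the mean tells you one is smooth and the other is 10fps.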