I would assume
this is it.
Yeah, it seems to be another arbitrary synthetic benchmark with such poorly defined categories as "multi core floating point speed" and "quad core integer speed". I wouldn't put a whole lot of faith in its results.
That's usually how people define "minimum FPS". (And yes, it makes very little sense: the arbitrary one-second discretization it introduces can have a significant impact on the final number while bearing no relation to the actual gaming experience.)
That's why I asked. The aforementioned discretization makes it an incredibly vague measure of the smoothness of a gaming experience (which is ostensibly the point of providing the figure in reviews). I'd almost just take the average FPS over it, as at least the mean is statistically well-defined.
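To make that discretization problem concrete, here's a rough Python sketch of how "minimum FPS" is typically derived (the frame-time data is made up, and the exact windowing convention — bucketing by frame start time and dropping the partial last window — is my assumption, not any site's actual methodology):

```python
def min_fps(frame_times_ms):
    """The usual 'minimum FPS': bucket frames into 1-second wall-clock
    windows by start time, count frames per window, take the smallest
    count. The (usually partial) final window is dropped. Note that the
    result depends on where within a window a hitch happens to land."""
    counts = {}
    t = 0.0
    for ft in frame_times_ms:
        window = int(t // 1000)
        counts[window] = counts.get(window, 0) + 1
        t += ft
    counts.pop(int(t // 1000), None)  # discard the incomplete last window
    return min(counts.values())

# Hypothetical traces: steady 60 FPS, and the same run with one 100 ms hitch.
steady = [16.7] * 240
hitch = [16.7] * 120 + [100.0] + [16.7] * 120
print(min_fps(steady), min_fps(hitch))  # the hitch drags one window down
```

The single 100 ms hitch knocks the figure down by several FPS, but how far depends entirely on where in the one-second window it falls — which is exactly the arbitrariness I mean.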
That's actually close to one of the metrics TechReport uses (though I don't think they use percentages). They present the time spent on frames beyond 50 ms, 33.3 ms, 16.7 ms and 8.3 ms, so you get a good idea of both severe stutter (>50 ms) and performance consistency at 30, 60 and 120 FPS.
Yep, TechReport is definitely my preferred site for these things. The only issue I have with them, as you mention, is that they report "time spent over X ms" rather than "percentage of time spent over X ms". Aside from the latter being easier to relate to, benchmark runs can vary a little in duration, and a percentage would normalise that away. In fact, thinking about it, I'd almost argue it should be inverted to "percentage of time spent under X ms", purely from a visual design standpoint: it would give you a better idea of what you're losing, proportionally, going from one piece of gaming hardware to the next, without having to refer to a scale.
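Here's a small Python sketch of the inverted, run-length-normalised version I mean (hypothetical frame-time data; this is my own illustration of the idea, not TechReport's actual code):

```python
def pct_time_under(frame_times_ms, threshold_ms):
    """Percentage of total run time spent on frames rendered within
    threshold_ms -- the complement of 'time spent beyond X ms',
    normalised by run duration so runs of different lengths compare."""
    total = sum(frame_times_ms)
    good = sum(ft for ft in frame_times_ms if ft <= threshold_ms)
    return 100.0 * good / total

# Hypothetical trace: mostly 60 FPS frames with a few 40 ms slow frames.
trace = [16.7] * 100 + [40.0] * 5
for threshold in (50.0, 33.3, 16.7, 8.3):
    print(f"under {threshold} ms: {pct_time_under(trace, threshold):.1f}%")
```

Reported this way, 100% at every threshold is the ideal, and any shortfall reads directly as the proportion of the run you spent below that smoothness target.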
Edit: They also tend to ignore short-term frametime variance (i.e. microstutter), but then again everyone seems to ignore that outside of multi-GPU analyses.