
What's the minimum input lag a TV should have for games to be playable?

devilhawk

Member
Am I the only one impressed by this clerk? I've never been able to get them to tell me anything about a TV that wasn't written on the card in front of it.
Is it even correct? The dude might have been talking out of his ass or just replied with the Hz number.

The only Samsung 4K LED TV I can find with that high of lag is 3 years old - which wouldn't be at Best Buy.
 

Sixfortyfive

He who pursues two rabbits gets two rabbits.
Turns out that my monitor that was advertising 1ms was actually 13ms, I feel deceived.


To quote an old SRK thread:

"RESPONSE TIME" IS NOT INPUT LAG.
"RESPONSE TIME" IS NOT INPUT LAG.
"RESPONSE TIME" IS NOT INPUT LAG.

Stop misreading the specs. TV manufacturers rarely if ever actually report input lag.

Am I the only one impressed by this clerk? I've never been able to get them to tell me anything about a TV that wasn't written on the card in front of it.

Is it even correct? The dude might have been talking out of his ass or just replied with the Hz number.

Actually, yeah, that's a good point. I wonder where he got the numbers.
 

RoadHazard

Gold Member
60ms = 3.6 frames of lag (so I guess really 4) at 60fps, or 1.8 frames (so 2) at 30 fps. That's a lot if you're playing a game that requires responsive input.
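If you want to sanity-check that kind of conversion yourself, it's just this arithmetic (a throwaway Python sketch, nothing TV-specific about it):

```python
# Convert display lag in milliseconds to frames at a given framerate.
def lag_in_frames(lag_ms, fps=60.0):
    frame_time_ms = 1000.0 / fps   # ~16.67 ms per frame at 60 fps
    return lag_ms / frame_time_ms

print(round(lag_in_frames(60, 60), 1))  # -> 3.6 frames at 60 fps
print(round(lag_in_frames(60, 30), 1))  # -> 1.8 frames at 30 fps
```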

To quote an old SRK thread:

"RESPONSE TIME" IS NOT INPUT LAG.
"RESPONSE TIME" IS NOT INPUT LAG.
"RESPONSE TIME" IS NOT INPUT LAG.

Stop misreading the specs. TV manufacturers rarely if ever actually report input lag.

Yeah, response time is just how quickly the pixels change to their new state. Input lag is the total lag from the TV receiving an image to it being visible on the screen, and is always longer, due to image processing etc.
 

shockdude

Member

To quote an old SRK thread:

"RESPONSE TIME" IS NOT INPUT LAG.
"RESPONSE TIME" IS NOT INPUT LAG.
"RESPONSE TIME" IS NOT INPUT LAG.

Stop misreading the specs. TV manufacturers rarely if ever actually report input lag.



Actually, yeah, that's a good point. I wonder where he got the numbers.
That site uses a different input lag testing methodology than everyone else. The monitor you linked has 3ms of input lag over a CRT. At 60Hz it will have 11ms average input lag.
 

Koren

Member
+1 to the fact that response time has nothing to do with input lag.

9 ms is the best you can do sadly. It's really pathetic.
What? How is this pathetic? You cannot really go under 9ms...

Remember how it works: at 60Hz, it takes more than 10ms to transmit the image. A 60Hz refresh of the screen takes 16ms. If you have 0ms of input lag at the top of the screen and you do a normal refresh, you'll have ~8ms of input lag in the middle. I must re-read the specs of the lag tester and how they compute the input lag (average of three bars?), but you need some really crazy tricks to go below 9ms...
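A rough sketch of that arithmetic (the active-line fraction below is just an approximation I'm assuming; real timings vary):

```python
# Why ~8 ms is about the floor at 60 Hz even with zero processing:
# the image takes most of a refresh period just to be transmitted and drawn.
REFRESH_HZ = 60
FRAME_PERIOD_MS = 1000 / REFRESH_HZ   # ~16.7 ms per refresh
ACTIVE_FRACTION = 0.93                # assumed share of the period spent on visible lines

def earliest_arrival_ms(position):
    """Time from the start of transmission until a point at the given
    vertical position (0.0 = top, 0.5 = middle, 1.0 = bottom) can be shown."""
    return position * ACTIVE_FRACTION * FRAME_PERIOD_MS

print(round(earliest_arrival_ms(0.5), 1))  # -> ~7.8 ms for the middle of the screen
```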

Agreed, 20ms or less. If you're not familiar with the site, http://displaylag.com is a great resource for finding low lag displays.
This... I really don't like anything above 20ms...

But it's still a matter of taste.
 
It's pathetic because my CRT is still way better and if my PS4 came with component inputs I'd still be using it instead of this shitty 10ms Asus.
 

NEO0MJ

Member
To quote an old SRK thread:

"RESPONSE TIME" IS NOT INPUT LAG.
"RESPONSE TIME" IS NOT INPUT LAG.
"RESPONSE TIME" IS NOT INPUT LAG.

Stop misreading the specs. TV manufacturers rarely if ever actually report input lag.

I know that now, no need to make me feel dumber ;_;

Still, I didn't have any other option locally, anyway. At least not something in a similar price range.

It's pathetic because my CRT is still way better and if my PS4 came with component inputs I'd still be using it instead of this shitty 10ms Asus.

I guess HDMI to component converters don't help?
 

shockdude

Member
It's pathetic because my CRT is still way better and if my PS4 came with component inputs I'd still be using it instead of this shitty 10ms Asus.
wat. Your CRT has at minimum 8.33ms average input lag; you're able to perceive a ~2ms difference in your Asus? Though to be fair you might be noticing pixel response time at that level.
 

Roge_NES

Member
Anything above 4 frames makes precision games like Street Fighter, Megaman or Dodonpachi much harder or even unplayable.

I did some testing not long ago on my current equipment and got some decent results.
1 frame = 16.66ms
[Images: lag test result photos]
 

dock

Member
I remember playing Rhythm Paradise on a friend's laggy TV and having a pretty miserable time. As soon as I played it on my own TV (a 22ms Sony Bravia) I had no problems. It was the lowest-lag TV I could get in 2014, and sadly Sony's TVs became >30ms from 2015 onwards.

I'm curious about so many people here saying they need <10ms. Does this stem from people playing on monitors on desks? What are the cases where 10ms and 20ms feel quite different?
 

OuiOuiBa

Member
What are the cases where 10ms and 20ms feel quite different?
It will never feel "quite different", but one has to consider that TV/monitor display lag is delay added at the end of the controller->screen chain, a chain in which many components add both hardware and software delays. In other words, even if the difference is barely perceptible on its own (say 10ms or less), it makes things worse overall.

However, one could argue that many games unfortunately have built-in delays that are far higher than 10 to 20ms (read http://www.gamasutra.com/view/feature/3725/measuring_responsiveness_in_video_.php), which can make a 10ms difference almost insignificant...for those games.

Would a TV with 60ms of input lag be okay or not? If not, what's the minimum input lag a TV should have for video games to be playable and enjoyable? 50ms? 40? How much, exactly?
- A TV with 60ms display lag will never be okay. It may be okay for some people and/or for some games (games that have extremely low built-in lag, or that never require fast / accurate input). I consider 30fps to never be okay either (though I deal with it), so take my comment with a grain of salt.
- If you consider all games, then the minimum input lag a TV should have for video games to be playable and enjoyable is zero, because a few games have built-in delay that makes their controls not quite playable and enjoyable (see the Gamasutra article about GTA IV and others, link above) even on a CRT, which is the fastest you could ever get.
 
Remember you've got to deal with:

TV input lag
Native game input lag
Controller input lag (if wireless it's worse)
Wi-Fi input lag
General network lag

Let's take Tekken 7.

It has 8 frames of input lag.
1 frame at 60 frames per second is 16.6 ms.
8 frames ≈ 133 ms.
That's over 1/10th of a second.

Now let's take a TV with 60ms lag. 60/16.6 = 3.6 frames @ 60fps.

So now we have ~193 ms, or 11.6 frames of lag, before we've even accounted for your controller.

Now let's take your chosen input method.
It could add anywhere from almost no lag to a whole frame or more:
http://www.teyah.net/sticklag/results.html
Let's say it adds 6.6 ms, which equals 0.4 frames.

Running total is now ~200 ms, or 12 whole frames at 60 FPS.
That's about 1/5th of a second.

Now let's play online. I'm assuming you use a wired connection; I don't have a figure for its lag, so I'll be nice and assume 0.

You find a guy online to play; he's at 48ms ping.
Let's assume this doesn't fluctuate. That's 3 more frames.

~248 ms of lag. Almost 15 frames.

Now I'm sure some of the sums are off up there and good netcode can hide it. But I'm trying to demonstrate it is a compounding effect.

For example, the game can maintain the 8-frame buffer and, when lag spikes, delete half of it to compensate. I believe SoulCalibur V did this.

But every step you can take to reduce the lag will help you out when there is inherently going to be lag you cannot mitigate.

By choosing a TV with input lag in the 12-15 ms range you remove two and a half frames.
Having a better wired controller is another half frame or more.
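If anyone wants to play with the running total above, it's just this sum (rough Python sketch; all figures are the example values from this post, not measurements):

```python
# Rough sketch of the compounding-lag arithmetic above.
# All figures are the example values from this post, not measurements.
FRAME_MS = 1000 / 60   # ~16.7 ms per frame at 60 fps

lag_sources_ms = {
    "built-in game lag (Tekken 7, 8 frames)": 8 * FRAME_MS,  # ~133 ms
    "TV display lag": 60.0,
    "stick/controller": 6.6,
    "online opponent at 48 ms ping": 48.0,
}

total_ms = sum(lag_sources_ms.values())
print(f"total: {total_ms:.0f} ms = {total_ms / FRAME_MS:.1f} frames")  # ~248 ms, ~14.9 frames

# Swapping the 60 ms TV for a 12-15 ms one takes ~45-48 ms (about 2.5-3 frames) off that total.
```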
 

phen0m24

Member
Try to keep it around 30-ish or less. Just read rtings for the info.

I went from a BenQ gaming monitor to a KS8000 and now to an LG C6 (yeah, the wrong direction, I know). At first the C6 felt a liiiiitle sluggish (game mode is 34 ms) but it feels fine now.
 

laxu

Member
I'm curious about so many people here saying they need <10ms. Does this stem from people playing on monitors on desks? What are the cases where 10ms and 20ms feel quite different?

I currently have an ASUS PG278Q and a Samsung KS7005 (Nordic KS8000) TV. The ASUS has an input lag of a mere 4ms and the Samsung is about 22ms. I can't notice a significant difference in my performance when playing on either of these displays.

The lower the better of course but I'd say that for everything except frame-perfect games (like fighting games) or really fast first person shooters you are fine as long as the input lag is below 30ms.

Also remember that your receiver can also add a significant amount of input lag. My Denon AVR1610 was terrible with this and I've now gone to just routing all my inputs thru my TV and using optical to get audio to the receiver. I don't think there are any numbers published for receivers so you have to just test how yours fares.
 

TLZ

Banned
I'm playing at around 30 and I think it's good. Any more though and you'll feel it. Any less is awesome.
 

Jerry

Member
Get owned dude.

Acer's best model the 2016 R271 is 9ms.

https://displaylag.com/display-database/

This is the ONLY site that gives accurate non-BS #s and if you check you'll see nothing in over 500 tested monitors scores less than 9 ms.

I fail to see how I am 'owned' when that site includes time from things the monitor cannot control, e.g. button press -> rendering -> signal transmission.

To quote an old SRK thread:

"RESPONSE TIME" IS NOT INPUT LAG.
"RESPONSE TIME" IS NOT INPUT LAG.
"RESPONSE TIME" IS NOT INPUT LAG.

Stop misreading the specs. TV manufacturers rarely if ever actually report input lag.

Actually, yeah, that's a good point. I wonder where he got the numbers.

We can talk about semantics between input lag and response time but the OP for sure meant response time.
 

nkarafo

Member
Interestingly current tech is a step back from the old CRT monitors that had zero (0) input lag
My LCD TV is supposedly 30ms. THUMPER (a rhythm game) is practically unplayable on it. Once I started playing it on my old CRT monitor, I managed to make some progress.

Just about everything feels better on a CRT. Not only is there no delay to your input, but you also get a sharp moving picture without any motion blur.
 
My LCD TV is supposedly 30ms. THUMPER (a rhythm game) is practically unplayable on it. Once I started playing it on my old CRT monitor, I managed to make some progress.

Just about everything feels better on a CRT. Not only is there no delay to your input, but you also get a sharp moving picture without any motion blur.

CRT monitors are still the gold standard in regards to input lag and motion blur; next would be plasma. Some things have gotten worse with time.
 

Stranya

Member
Wow so you're pretty much fucked with a low range Tv?
Actually, it's the opposite, at least in terms of cheaper models from Samsung etc. For years now, the lower-range Sammys have had far fewer processing bells and whistles than the top-of-the-line models and, as such, far lower input lag. I have a 50" entry-level model from 6 years ago with less than 20ms lag. The sorts of things that sell high-end TVs for films/TV are often exactly what you don't want if your aim is low input lag.
 
TVs just do not have the kind of grey-to-grey response times or low input lag you want.

BenQ/Zowie monitors are the way to go, there's no substitute for the kind of experience you get with those.
 

Sixfortyfive

He who pursues two rabbits gets two rabbits.
And you cant talk about input lag without knowing the whole chain, not just the TV/monitor

...Okay.

Interestingly current tech is a step back from the old CRT monitors that had zero (0) input lag

CRTs aren't instantaneous. They're constrained by their refresh rate and their raster scan method.

At 60 Hz, a new frame gets sent from the source every 17ms, and then that frame is drawn to the screen line by line, pixel by pixel:

[Animation: raster scan drawing the frame line by line, top to bottom]


You can take the same lag testing equipment that modern websites use to test LCDs and, through some converters, use them on CRTs as well:

[Photo: lag tester readings on a Dell E773C CRT, fed a 1080p signal over VGA via an HDFury2]


The tester interacts with the display in two ways:

1) It sends a test pattern to the screen via its HDMI port. (If the screen doesn't have an HDMI input, then you have to chain it through a converter of some kind.)

2) It has a photo-sensor that you place over each of the three white bars displayed on the pattern (top, middle, and bottom). This measures the total time elapsed from (a) when the tester fired off the test pattern over HDMI and (b) when that test pattern is finally drawn on the screen.

Because CRTs (and most flatscreens, actually) draw their picture line by line, you'll observe faster readings on the top and slower readings on the bottom. Because of the 60 Hz refresh rate, the difference between the start and end point of the frame is close to 17ms. The difference between the top and bottom bars of the above snapshot is roughly 14ms, which sounds about right when you take into account overscan and other minutiae. Average the above values together and you'll get 7.9ms, which is the total lag rating that this screen would receive from a website like Displaylag, not 0ms.

And that's the floor for a CRT. In theory, it's possible for future display tech to drop to lower values than this if the refresh rate increases beyond 60 Hz (not only for TVs, but for the consoles and other source devices that feed them the picture in the first place). Can't say that I ever expect it to happen though.
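To put rough numbers on that averaging, here's a minimal model of a zero-processing 60 Hz display as a three-bar tester would see it (the bar positions and active-line fraction are my assumptions, not the actual Dell readings):

```python
# Minimal model of a zero-processing 60 Hz display measured by a three-bar
# lag tester (photosensor placed on a top, middle, and bottom bar).
# Idealized numbers; the bar positions and active fraction are assumptions.
FRAME_MS = 1000 / 60    # ~16.7 ms per refresh
ACTIVE = 0.85           # assumed share of the refresh spent drawing visible lines

bar_positions = {"top": 0.1, "middle": 0.5, "bottom": 0.9}
readings_ms = {bar: pos * ACTIVE * FRAME_MS for bar, pos in bar_positions.items()}

for bar, ms in readings_ms.items():
    print(f"{bar}: {ms:.1f} ms")
print(f"average: {sum(readings_ms.values()) / len(readings_ms):.1f} ms")
# -> roughly 1.4 / 7.1 / 12.8 ms, averaging ~7.1 ms: nonzero even with no processing at all.
```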
 

Fbh

Member
I'd still write down the model number and check online.

A lot of TVs have a massive difference when enabling game mode.
My 4K Samsung has borderline unplayable input lag when game mode isn't enabled. But with game mode it's like 35 ms or so, which I honestly don't notice.
 

adversarial

Member
I own a Vizio D40U-D1 and personally love it for being 4K and having the lowest input lag of any 4K TV out at the moment. It's by no means the most cutting edge, but for the price and having about only 1 frame of input lag it's a great TV for gaming especially if you play a lot of fighting games.

Does this do HDR?
 

Stranya

Member
I'd still write down the model number and check online.

A lot of TVs have a massive difference when enabling game mode.
My 4K Samsung has borderline unplayable input lag when game mode isn't enabled. But with game mode it's like 35 ms or so, which I honestly don't notice.
Does the Samsung input label trick still work? (Renaming the relevant HDMI input to "PC" turns off all post-processing, reducing input lag as much as possible.)
 

Teh Lurv

Member
Anything above 4 frames makes precision games like Street Fighter, Megaman or Dodonpachi much harder or even unplayable.

My 6-year-old HDTV has about 2-3 frames of lag depending on the source's resolution. Running my retro consoles through a video processor to my TV gives me 4-5 frames of lag overall. The delay is noticeable when I mirror the output to a CRT TV, but I can mentally compensate enough to make Megaman or Donpachi playable. When I tried a different video processor that introduced another 3 frames of lag (7-8 frames), everything that requires reflexes became unplayable.
 
Average the above values together and you'll get 7.9ms, which is the total lag rating that this screen would receive from a website like Displaylag, not 0ms.

And that's the floor for a CRT. In theory, it's possible for future display tech to drop to lower values than this if the refresh rate increases beyond 60 Hz (not only for TVs, but for the consoles and other source devices that feed them the picture in the first place). Can't say that I ever expect it to happen though.

Thanks. It has been said to be zero, but you've actually put some effort into explaining this, so if it's 8 ms then I'm happy with that.
 

Ash735

Member
In general 34ms should be the high point most are willing to go to before it becomes noticeable for most games.
 

Koren

Member
I fail to see how I am 'owned' when that site includes time from things the monitor cannot control, e.g. button press -> rendering -> signal transmission.
?

Displaylag uses a Leo Bodnar device to test input lag; there's no rendering and definitely no button press involved.

It does involve signal reception and screen refresh, but that's part of the input lag of a monitor/TV set...

You mean maximum, right? The minimum is obviously zero.
The "normal" minimum with a 60Hz display isn't 0 but ~8ms. You could go lower in theory, but not down to zero (it will never be, it would mean instantaneous transmission and instantaneous refresh).
 

Koren

Member
Thanks, it has been said to be zero, but you have actually given some effort in explaining this so if it's 8 ms then i'm happy with that
I agree with you, the explanation is great.

As far as the "0 / 8ms", it depends how you view things.


A CRT adds basically 0 input lag over an analog signal, at any place on the screen (top, middle, bottom).

But very few devices build the picture at the same time as they send it. Even on consoles without a framebuffer, you usually "freeze" the state when the screen refresh starts. So you get ~0ms of lag at the top and ~16ms at the bottom, as explained, measured using the time the image begins being sent as the reference. That's probably the sanest way to measure input lag.

Newer displays can't do much better, and it's already impressive that they can get close to that (it means they start refreshing the screen before receiving the whole picture... a 1080p device with 10ms of input lag has basically received only 100-150 lines that it still hasn't displayed). That's quite an impressive result.


But you can always measure input lag with respect to CRT input lag. In this case, a perfect screen will be close to 0 (it could even be slightly negative if you take "advanced" CRTs that were holding a couple lines in buffers, for example to improve color rendition)


But the best proof that CRTs had lag is this:
[Image: Nintendo SNES Super Scope]


The Super Scope and most other light guns (NOT the NES Zapper) are basically precise input lag testers. You compute the aim position by precisely measuring (down to a couple of µs, not mere ms) the lag at the point the device is aiming at. If you aim at a lower point, the lag is higher. If you aim at a point further to the right, the lag is also (slightly) higher.

Calibration is done so that you can adjust to each display's slightly different lag.
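To illustrate the principle (not the Super Scope's actual implementation), here's a sketch of how elapsed time since the top of a 60 Hz raster frame maps to a screen position; the line period and pixel count are generic NTSC-ish figures, and blanking intervals are ignored:

```python
# Sketch of the light-gun principle: the delay between the start of a frame
# and the moment the beam lights up the aimed spot encodes that spot's position.
# Generic NTSC-ish timing, blanking ignored; illustration only.
LINE_PERIOD_US = 63.5    # one scanline takes ~63.5 microseconds
PIXELS_PER_LINE = 256    # assumed horizontal resolution of the console output

def beam_position(elapsed_us):
    """Map time since the top of the frame to the (line, pixel) under the beam."""
    line = int(elapsed_us // LINE_PERIOD_US)
    pixel = int((elapsed_us % LINE_PERIOD_US) / LINE_PERIOD_US * PIXELS_PER_LINE)
    return line, pixel

print(beam_position(8000))  # -> (125, 251): about halfway down the screen, near the right edge
# Aiming lower or further right means the photodiode triggers later, i.e. more measured delay.
```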
 
I saw a TV at Best Buy today. A 4K+HDR Samsung LED TV, for $820 CAD (on sale from $1000 CAD). I asked a clerk and he told me the TV had 60ms of input lag. I honestly don't know if that's too much for video games to be playable or if it wouldn't matter.

I don't play fighting games, but I do play action games like Nioh and Dark Souls. Would a TV with 60ms of input lag be okay or not? If not, what's the minimum input lag a TV should have for video games to be playable and enjoyable? 50ms? 40? How much, exactly?

Can we confirm whether or not this Best Buy sales associate actually had any idea what he was talking about? I'd be shocked if he actually had real knowledge of input lag.
 
My last TV was around 42ms and it made shooters and any game requiring precise timing completely unplayable.

My ex would use our main tv with no issues, but if she used the bedroom TV so we could play online together she would constantly complain about the lag.

From what I hear around 20 is the standard, and I don't know of many sets that ever got much below that aside from a Sony unit a while back that was like 18ms.

Could be wrong though.
 

recursive

Member
Input lag in general is a highly overblown "detriment" of TVs and monitors.
Unless you're a fighting game or fps competitive player it likely won't bother you at all.

Lower is better of course, but for general purpose gaming, I wouldn't let it be a deciding factor on a TV if the picture quality and price is good.

This is really bad advice. Significant input lag decouples the controls from the visual response regardless of genre. There are plenty of options available that offer excellent picture quality, low input lag, and a reasonable price, so it really doesn't make much sense to settle for less.
 
This is all kind of crazy. 100ms is 1/10th of a second, and human reaction times are about 4/10ths of a second if you have really fast reaction times, for most folks it's more.

60ms is not unplayable, no TV has display lag that is in the unplayable range, to answer OP. You'd have to be a very high level player to really notice the difference between 60 and 40, and I almost guarantee if you lined TV's up next to each other and tested folks that the amount of times people could tell the difference would match the law of averages for guessing.

I can easily tell the difference between 25 ms and 65 ms. Controls get noticeably more sluggish. Doesn't matter what kind of game it is. Though, do most people care enough? Doubtful. For me it's unbearable.
 
STGs, competitive fighters, true arcade games ~ 10ms lag or lower

competitive FPS, competitive RTS, known tough games, retro or mini consoles, retro/arcade-based games ~ 20ms lag or lower

general multiplayer ~ 30ms or lower

general single player/everything else ~ 60ms or lower
 
Spaced Harrier: yikes, I had no idea wired sticks were so laggy! D:

Most aren't so bad.

Converters can fuck things up though.

I played most of my SFIV career on a CRT at home.
The Irish community invested a lot of time and money on lag-free monitors, though.

BenQ eventually sponsored us, so that's what we have now.


Monitors suck for sound though, at events... always!! :(
 
STGs, competitive fighters, true arcade games ~ 10ms lag or lower

competitive FPS, competitive RTS, known tough games, retro or mini consoles, retro/arcade-based games ~ 20ms lag or lower

general multiplayer ~ 30ms or lower

general single player/everything else ~ 60ms or lower

It doesn't matter what you are playing. 60ms is always terrible.
 

shockdude

Member
...Okay.



CRTs aren't instantaneous. They're constrained by their refresh rate and their raster scan method.

At 60 Hz, a new frame gets sent from the source every 17ms, and then that frame is drawn to the screen line by line, pixel by pixel:

[Animation: raster scan drawing the frame line by line, top to bottom]


You can take the same lag testing equipment that modern websites use to test LCDs and, through some converters, use them on CRTs as well:

[Photo: lag tester readings on a Dell E773C CRT, fed a 1080p signal over VGA via an HDFury2]


The tester interacts with the display in two ways:

1) It sends a test pattern to the screen via its HDMI port. (If the screen doesn't have an HDMI input, then you have to chain it through a converter of some kind.)

2) It has a photo-sensor that you place over each of the three white bars displayed on the pattern (top, middle, and bottom). This measures the total time elapsed from (a) when the tester fired off the test pattern over HDMI and (b) when that test pattern is finally drawn on the screen.

Because CRTs (and most flatscreens, actually) draw their picture line by line, you'll observe faster readings on the top and slower readings on the bottom. Because of the 60 Hz refresh rate, the difference between the start and end point of the frame is close to 17ms. The difference between the top and bottom bars of the above snapshot is roughly 14ms, which sounds about right when you take into account overscan and other minutiae. Average the above values together and you'll get 7.9ms, which is the total lag rating that this screen would receive from a website like Displaylag, not 0ms.

And that's the floor for a CRT. In theory, it's possible for future display tech to drop to lower values than this if the refresh rate increases beyond 60 Hz (not only for TVs, but for the consoles and other source devices that feed them the picture in the first place). Can't say that I ever expect it to happen though.
This explanation is fantastic. Well done.
Impressive to see the CRT having <8ms of lag. Seems like it overscans the bottom of the screen a bit.
 

NEO0MJ

Member
Speaking of CRTs, I found an old one that we used to use, our last TV before going with LCD panels. Never noticed that it was 100 Hz. Does that mean it could have played games at 100 FPS?
 

OuiOuiBa

Member
Speaking of CRTs, I found an old one that we used to use, our last TV before going with LCD panels. Never noticed that it was 100 Hz. Does that mean it could have played games at 100 FPS?
I think these are 100Hz TVs but not 100fps TVs; that is to say, they interpolate (just like many 120/240Hz TVs; today there are only a few actual 240fps LCD monitors to choose from).
Consumer CRT monitors could cover a wide range of refresh rates; for instance, the Mitsubishi Diamond Pro 2070SB did 50Hz-160Hz (160 actual frames per second). The higher the resolution, the lower the maximum refresh rate; the 2070SB went up to 2048x1536@86Hz.
 
Having had an E6 (34 ms) and a B7 (21 ms) here, I noticed the 13 ms difference, but I'm still considering going back to an E6 due to the 3D.

I would not go higher than 40 ms though, so 60 ms is absolutely out of the question.
 

Klotera

Member
The rule of thumb I've heard is to look for sub-40. Lower is obviously better, but most people won't notice it as long as it's within this range.
 