
Why is it only 30 or 60fps, no 45?

That's a lie.

The input lag is horrendously bad.

When I activate Motionflow, I have over 100ms of input lag on my Samsung. HDTVTest is a good source of TV reviews with details of input lag. Sure, the image looks better, but it causes your game to feel like shit.

The input lag isn't horrendously bad (though it must come down to the TV). For one, you can barely see or feel any difference at all when you're playing a 60fps game. Give Rage a try and switch back and forth. The input lag also varies by game: BF3 on 360 isn't off by enough to cause problems, but Gears Judgment felt a tad off and "floaty." Maybe it's the input lag that's already there in a game, before the TV adds any more, that makes certain games feel better or worse.
 
Your screen, most likely, refreshes every 1/60th of a second. If you provide it 60 frames of video, each frame lasts 1/60th of a second. If you provide it 30 frames of video each frame lasts 2/60ths of a second. If you provide it 45 frames of video, 30 of them will last 1/60th of a second, while 15 of them will last 2/60ths of a second (typically).

Having some frames last longer than others makes the video stutter slightly and it won't look very smooth.
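
To make that arithmetic concrete, here's a quick Python sketch (assuming a standard fixed 60Hz panel with vsync; the frame counts are just illustrative) that works out how many refreshes each game frame stays on screen:

```python
import math

REFRESH_HZ = 60  # a typical fixed-refresh panel

def refreshes_per_frame(fps, frames=12):
    """For `frames` consecutive game frames at `fps`, count how many
    60Hz refreshes each one stays on screen (vsync on, no tearing)."""
    counts = []
    for i in range(frames):
        # Frame i is ready at i/fps seconds. It is shown from the first
        # refresh at or after that moment until frame i+1 replaces it.
        start = math.ceil(i * REFRESH_HZ / fps)
        end = math.ceil((i + 1) * REFRESH_HZ / fps)
        counts.append(end - start)
    return counts

print(refreshes_per_frame(60))  # [1, 1, 1, ...]          every frame 1/60s
print(refreshes_per_frame(30))  # [2, 2, 2, ...]          every frame 2/60s
print(refreshes_per_frame(45))  # [2, 1, 1, 2, 1, 1, ...] uneven = judder
```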

And we're done here.
 
^ GoW came up; it sacrifices smoothness for responsiveness well, imo. Something about triple buffering magic that might see more use next-gen too... not very clear on it though. And that post was greatly informative too of course. Not yours, though :P

At framerates below 40fps I play with a controller, since the mouse feels awful at those response times, so it isn't as much of an issue. How much of an impact triple buffering has seems to depend on the game; if it causes significant problems, I use adaptive v-sync to lock at 30 or 60.

It's something that requires a bit of testing on your end to get exact. I'm so sensitive to variable framerates that it's essential for me (I can spot even a 2-3 frame variance), but how you tailor it will be specific to your sensitivity and your desired graphics-to-performance target.

Edit: For example, I'm currently playing Borderlands 2 at a locked 50fps, maxed out, to avoid the framerate fluctuating between 50 and 60. I could turn down some settings to run at a locked 60, but 50 is fine by me and I love the PhysX effects. To me, the only performance statistic that matters is the minimum framerate.
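
If you're curious what a framerate cap like that is actually doing, here's a minimal Python sketch; render_frame is a hypothetical stand-in for the game's work, and the 50fps target just mirrors my Borderlands 2 example:

```python
import time

TARGET_FPS = 50                      # the cap, as in my Borderlands 2 case
FRAME_BUDGET = 1.0 / TARGET_FPS      # 20ms per frame

def render_frame():
    """Stand-in for the game's simulation + rendering work."""
    time.sleep(0.012)                # pretend a frame takes ~12ms

def run(frames=50):
    next_deadline = time.perf_counter() + FRAME_BUDGET
    for _ in range(frames):
        render_frame()
        # Sleep off whatever is left of the 20ms budget so frames go
        # out on a fixed cadence instead of fluctuating between 50-60.
        remaining = next_deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)
        else:
            next_deadline = time.perf_counter()   # missed it; resync
        next_deadline += FRAME_BUDGET

run()
```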

In the last few years I've found myself using the 360 controller way more than mouse and keyboard for that reason. I used both with BF3, although that was probably more for the jets.

50 is usually pretty good for me too; all that PAL exposure, I think.


V-Sync induces a lot of lag.

I don't really notice it with triple buffering. I think TB is worth it to eliminate tearing.
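
For anyone wondering why triple buffering helps there: the extra back buffer means the GPU doesn't have to stall waiting for the flip when it misses a vblank. A toy simulation of just that queueing logic (not any real API, and the 20ms render time is made up):

```python
# Toy model of vsynced buffer flipping on a 60Hz display when the GPU
# needs 20ms per frame (i.e. it can only manage 50fps on its own).

REFRESH_MS = 1000 / 60   # one vblank every ~16.7ms
RENDER_MS = 20.0         # assumed GPU time per frame

def simulate(back_buffers, sim_ms=1000.0):
    """Return how many NEW frames reach the screen in sim_ms."""
    gpu_free = 0.0   # when the GPU is ready to start another frame
    queued = 0       # finished frames sitting in back buffers
    shown = 0
    vblank = REFRESH_MS
    while vblank <= sim_ms:
        # Render as long as a back buffer is free and the frame
        # finishes before this vblank.
        while queued < back_buffers and gpu_free + RENDER_MS <= vblank:
            gpu_free += RENDER_MS
            queued += 1
        if queued > 0:
            if queued == back_buffers:
                # Every buffer was full, so the GPU had stalled;
                # the flip frees a buffer and it resumes now.
                gpu_free = max(gpu_free, vblank)
            queued -= 1   # flip the oldest finished frame
            shown += 1
        vblank += REFRESH_MS
    return shown

print(simulate(back_buffers=1))  # double buffering: ~30 fps (GPU stalls)
print(simulate(back_buffers=2))  # triple buffering: ~50 fps (no stall)
```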


I agree. Anything higher than 30FPS is a bonus to me. PC games have always felt more responsive than their console counterparts thanks to the slightly higher framerates, even if I may not always get a perfect 60.

Hmm, will try it out more when I upgrade my PC, thanks. Although I'm hoping next-gen games use it more too, for more responsive games. I'm trying to hold out for the next CPU that destroys everything before it, but I don't think I'll last.
 
The input lag isn't horrendously bad (though it must come down to the TV). For one, you can barely see or feel any difference at all when you're playing a 60fps game. Give Rage a try and switch back and forth. The input lag also varies by game: BF3 on 360 isn't off by enough to cause problems, but Gears Judgment felt a tad off and "floaty."

The amount of fps is irrelevant, as the MotionPlus crap essentially just buffers each image at the incoming refresh rate. Even if it's a 120Hz TV, it'll still poll the frame every 1/60th of a second (60Hz). Even with perfect, infinite computing power (sounds like a job for the cloud!), the input lag will be at least one frame, since it holds back the first frame it polls, then grabs a second. It then interpolates the two by seeing what changed between the two images and guessing heavily based on that movement. Things like crosshairs and HUDs are a problem, since they're a non-moving element on a moving background, and the TV is like "I'm assuming the HUD will move with everything else."

So yeah, 16.6ms of lag MINIMUM, notwithstanding the computation that has to be done on the two held-back images to create the interpolation. I'm guessing the extra lag this creates is around 50ms in the best case, with maybe 100ms being typical. I have one PC screen with about 8ms of lag and one with about 50. When I used the 50ms one, I thought it was perfectly fine. Now that I'm used to my new screen, when I move the mouse over to the other one it's like "holy crap, my mouse is trapped in syrup." You can get used to the feel of input lag, but getting used to it doesn't make the delay go away. Your reaction time is already a couple hundred milliseconds; throw an extra 50-100ms on top of that and you're simply putting yourself at a disadvantage most other players don't have.
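
To put the hold-back argument in concrete terms, here's a toy Python sketch; the crossfade is a stand-in for real motion estimation, and the numbers are illustrative guesses, not measurements:

```python
# Why motion interpolation can't be lag-free: to show a synthesized
# in-between image, the TV needs the NEXT frame before displaying the
# current one, so everything shifts back at least one frame.

POLL_HZ = 60                    # effective sample rate, per the post
HOLD_MS = 1000 / POLL_HZ        # ~16.7ms: the unavoidable one-frame hold

def interpolate(frame_a, frame_b, t=0.5):
    """Toy stand-in: real TVs do motion estimation; a plain crossfade
    is enough to show the data dependency on the next frame."""
    return [(1 - t) * a + t * b for a, b in zip(frame_a, frame_b)]

prev, nxt = [0.0, 10.0], [4.0, 10.0]    # two tiny "frames" of pixels
print(interpolate(prev, nxt))           # [2.0, 10.0]: needs nxt first
print(f"floor: {HOLD_MS:.1f}ms, plus whatever the estimation math costs")
```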

All in all, never, ever use MotionPlus for anything game related. If you enjoy artifacts and interpolated movies, go nuts, and I'll shake my head while you're not looking; at least you're only ruining the movie, not creating input lag for yourself.

Really, just never use it.

EDIT: Oh, and turn off all your other "auto" crap too. If you use shit like dynamic contrast and adaptive coloring or whatever all that stuff is called, your TV is holding everything back even more. Turn it off. Right now.
 
The input lag isn't horrendously bad (though it must come down to the TV). For one, you can barely see or feel any difference at all when you're playing a 60fps game. Give Rage a try and switch back and forth. The input lag also varies by game: BF3 on 360 isn't off by enough to cause problems, but Gears Judgment felt a tad off and "floaty." Maybe it's the input lag that's already there in a game, before the TV adds any more, that makes certain games feel better or worse.

I can. Tried it with Call of Duty, and it makes a big difference when playing multiplayer.

With game mode, I'm already at a disadvantage with 40ms of input lag on my LCD.

When I play on my plasma, which doesn't have much lag at all, the difference in responsiveness is like night and day.

I've tried using Motionflow. And it's horrible.
 
I have wondered why we can't have a 45fps compromise. Playing games on the PC, 45fps feels smoother than 30fps, but less smooth than 60fps. If that makes any sense.
 
I have wondered why we can't have a 45fps compromise. Playing games on the PC, 45fps feels smoother than 30fps, but less smooth than 60fps. If that makes any sense.

Either you'll show 1/3 of the frames for twice as long as the rest, or you'll have screen tearing 1/3 of the time you're playing. Neither is good.
 
I have wondered why we can't have a 45fps compromise. Playing games on the PC, 45fps feels smoother than 30fps, but less smooth than 60fps. If that makes any sense.

Someone already stated why this doesn't work earlier in the thread:

Your screen, most likely, refreshes every 1/60th of a second. If you provide it 60 frames of video, each frame lasts 1/60th of a second. If you provide it 30 frames of video each frame lasts 2/60ths of a second. If you provide it 45 frames of video, 30 of them will last 1/60th of a second, while 15 of them will last 2/60ths of a second (typically).

Having some frames last longer than others makes the video stutter slightly and it won't look very smooth.

Basically, you want the game's framerate to be in sync with your screen that most likely has a frequency of 60 Hz.
 
I suppose it depends on the context of what you're discussing. For broadcast and media distribution, yes (and for TVs that were meant to display those standards). For analogue monitors, though, refresh rate was largely unhindered before digital displays came into the picture.

This was just a function of the technology and its age. The driver behind higher refresh rates for CRTs was largely that low refresh rates caused headaches. This wasn't an issue with LCDs, so companies didn't have the same impetus to ramp it up.

Also, there was a coincidental mass adoption of audio/video media happening in the computer space with the proliferation of VCD, DV, DVD, webcams, etc., all operating in the 60Hz TV/video standards realm.

In combination, these meant the DVI standard budgeted just enough bandwidth to drive a digital screen at 60Hz when running at high resolutions, which, at uncompressed rates, was still an amazing technical achievement for the time.
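
As a rough sanity check of that budget (assuming the single-link DVI pixel clock cap of 165MHz and standard CEA 1080p timings; treat the figures as approximate):

```python
# Rough check of the single-link DVI budget (165MHz pixel clock max)
# against 1080p60 with standard CEA blanking (2200x1125 total pixels).

DVI_SINGLE_LINK_MHZ = 165          # spec'd max pixel clock
total_w, total_h = 2200, 1125      # active 1920x1080 plus blanking
refresh = 60

pixel_clock_mhz = total_w * total_h * refresh / 1e6
print(f"1080p60 needs {pixel_clock_mhz:.1f}MHz "
      f"of the {DVI_SINGLE_LINK_MHZ}MHz budget")     # 148.5MHz: just fits
# The same resolution at 120Hz would need ~297MHz, double the single
# link's budget - hence dual-link and the newer standards after it.
print(f"1080p120 would need {pixel_clock_mhz * 2:.1f}MHz")
```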

The issue is that there was no mass demand for increasing the rate, and there were (and still are) too many legacy standards to deal with to remove the idea of a fixed refresh rate altogether.
 