
I just accidentally activated my TV's motion smoother...

I use it on FPS games (offline) sometimes. Killzone 3 looked pretty good with it, and it also makes the image cleaner in terms of jaggies. Of course I'd never use it in an online game, but offline it does make some games look and feel smoother.
 
Just bought a Samsung TV. It's 60Hz and being shipped to me now... will this feature make any noticeable difference for me?

I imagine it'd be useless in games already running at 60fps. But would Motionflow give my 30fps games "the feel of 60fps"?
:lol

I know a lot of people here are hating on the feature, but honestly, if you're a bit of a frame rate and graphics whore like me, have a decent PC, like to play your PC games on a big-screen TV, and don't bother with multiplayer precision/frame-lag stuff, I highly recommend something 120Hz or higher.

If you don't like it, you can always turn it off, but I hate to say it: you may be missing out on visually enhanced, pleasant gaming. I would never watch a movie or a TV show with it turned on, but I always have it on when I'm playing games running at, say, 35-40fps and over, to make them look even smoother. It's like a "cheap" way of getting more frames out of your games.

Seems like most people here will argue that technically your frame rate isn't increasing. I cannot argue with that, but I also cannot argue with my eyes.

I guess in conclusion: if a 60Hz TV is only 100 bucks cheaper than its 120Hz counterpart, I highly recommend you pay that extra 100 dollars and get the 120Hz TV, or even a 240Hz one (I have no experience with those). I have an older but high-end 120Hz Samsung TV that handles Motion Plus pretty well; not many cheap-brand TVs can, mind you. If you're going to spend lots of time gaming on your TV, just get it! Do it, do it! (You will enjoy and notice more of a difference on PC games with frame rates somewhere between 30 and 60fps than on console games locked at 30fps, just so you know.)
 
Racing hides input lag well.

Incidentally, that's why Apple has always touted Real Racing 2 when hyping AirPlay; the lag is so bad that very, very few genres have any possible use for AirPlay, but with racing you don't even notice.
 
I actually don't mind this with some games, except when it gets to certain parts, where it becomes kinda noticeable. Like when I'm playing Madden and always miss the max power on a kickoff, when normally I hit it on a regular basis on a lower-lag monitor.
 
Not sure how any self-respecting gamer could put up with the additional input lag motion processing adds. I find most LCD displays bad enough in game mode, never mind outside of it. Also, the motion processing isn't perfect and introduces plenty of artefacts.

I'm so glad I went plasma for the first time this year (Panasonic GT50), and now I could never possibly go back to LCD. Yes, I've become one of those plasma people. The motion clarity in gaming is insane, especially in 60fps games, where there isn't a single ounce of motion blur or ghosting. 60fps games like Joe Danger, Sonic 4, Hell Yeah, Black Ops, Daytona etc. all feel like a whole new experience on plasma because of this. So glad I quickly swapped out the clouding-ridden ES8000 I almost wasted my money on in May.

I actually remember being put off by LCD ghosting when I first switched to the technology from CRT years ago. But then I grew used to it and didn't think about it. It's only when I've been in arcades with CRT displays since that I've noticed the old clarity I missed on 60fps. Until this year... when I finally got a plasma! And holy shit I'll never look back. Glad to see that OLED will offer similar (if not better) motion clarity too.

Also, games on plasma seem to have fully saturated colours without losing detail within them, if that makes any sense. There's just something about the colour that matches CRT, something I could never replicate on any of the LCD and LED displays I've had over the years. Maybe the gamut?

Anyway, I've upgraded my TV almost every year for the past 8 years, but this is the year when I'm finally completely happy and don't ever want my GT50 to break down. I've got a 5-year warranty with it, and I'm hoping that by the end of that period OLED will be fully established and the niggles will be sorted.
 
Gamepad controls are good for covering up input lag. I can't even use vsync when I'm using m/kb because of the few extra ms of input lag, but I don't notice motion interpolation lag on a gamepad.

Top down games work perfectly with motion interpolation, for some reason. Everything else is a mixed bag.
 
This smoothness is why PC gamers are so insistent on a 60fps frame rate. Things look excellent when those extra frames are there. The motion-enhanced TVs help console games achieve that same look... but mine lags too much. Still, if you can stand it and don't game on a PC, it's the next best thing.
 
It makes things look pretty good and pretty much eliminates any blur / compensates for a lower FPS, but input lag is greatly increased for me, and I can easily notice the glitches this mode has (mostly noticeable when panning the camera, when anything in the background doesn't line up with the rest).
 
... Holy shit. Forza Horizon (which is locked at 30fps) feels like 60fps, and there's no real increase in input lag noticeable to me.

I have a Toshiba TV with Active Vision 400 mode. (Pseudo 400Hz)

Anyone tried their motion smoother with games? (Motion Plus --> Samsung; Motionflow --> Sony)

I'm going to try Far Cry 3 now, but I guess with a dynamic fps the stuttering will be increased.

I've found it's really hit or miss for games in terms of creating artifacts and odd moments of janky motion. I generally keep it off, though when I did have it on I didn't notice a high amount of lag. But I only tested it with Forza Horizon and Assassin's Creed 3. A fighting game and a twitch shooter would have made for a better test.
 
I couldn't play Sonic Generations for shit because of the terrible lag; I thought the game was just shit until I turned on my Samsung LED/LCD TV's game mode, which disables all the post-processing and "smoothing" rubbish and let me jump in time again.
 
At best I always find this effect looks cheap; at worst it adds what I'll call artificial throttling. Just all-around unpleasant. Mind you, this is primarily based on in-store demos (where I've quite literally never found it looks good) and suffering through the 2009 Star Trek film with it on the entire time.

I would think it would be nearly intolerable in video games. Most LED TVs have what, 2-3 frames of input lag as it is? Turning this on at least doubles that. Add to that the input lag that's usually built into most games: 1 frame in a 60fps game is 16ms, 32ms in a 30fps game. Control responsiveness/perception will vary from game to game obviously, but the lower that number is when all factors are added up, the better. I recall people complained about Killzone 2 a few years back because it had something like 60ms of lag built into your controller inputs, so you had to be very deliberate with your actions. Coupled with the average TV, you might be seeing a 4- or 5-frame delay from pressing the button on the controller until that action is absorbed by the game's logic and displayed on screen. Turn on motion interpolation and you might lose upwards of a third of a second; that doesn't sound like much time, but it leads to very sluggish play.
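The arithmetic in that post is easy to sketch. This is a minimal illustration, assuming made-up but plausible numbers for the TV and game buffering (nothing here is a measurement of any specific set or title):

```python
# Rough input-latency budget for a console game on a TV.
# All figures are illustrative assumptions, not measurements.

def frame_ms(fps: int) -> float:
    """Duration of one frame in milliseconds."""
    return 1000.0 / fps

def total_lag_ms(game_frames: int, fps: int, tv_ms: float,
                 interp_ms: float = 0.0) -> float:
    """Game-side buffering (in frames) + display latency (+ optional interpolation)."""
    return game_frames * frame_ms(fps) + tv_ms + interp_ms

# 1 frame is ~16.7ms at 60fps and ~33.3ms at 30fps, as the post says.
assert round(frame_ms(60), 1) == 16.7
assert round(frame_ms(30), 1) == 33.3

# A 30fps game buffering ~2 frames on a TV with ~50ms of lag:
base = total_lag_ms(game_frames=2, fps=30, tv_ms=50.0)          # ~117ms
smoothed = total_lag_ms(2, 30, tv_ms=50.0, interp_ms=100.0)     # ~217ms
```

With a hypothetical 100ms interpolation penalty added, the total does indeed creep toward the "upwards of a third of a second" the poster describes.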
 
This smoothness is why PC gamers are so insistent on a 60fps frame rate. Things look excellent when those extra frames are there. The motion-enhanced TVs help console games achieve that same look... but mine lags too much. Still, if you can stand it and don't game on a PC, it's the next best thing.
The problem is that while it may look better, it doesn't play any better, because those frames aren't really real. Some gamers want 60fps for more reasons than "it looks good".
 
Other than the horrid input lag I've experienced on many sets, the problem with motion smoothing is that the acceleration of objects comes out wrong, since the TV only uses 2 frames of data to make the in-between frame, making things look like they're in fast forward to people who are very perceptive (like me).
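A toy sketch of what that poster is describing: a two-frame interpolator can only draw a straight line between the two positions it has, so an accelerating object gets placed where constant velocity would have put it. The numbers below are arbitrary, chosen only to make the error visible:

```python
def true_position(t: float, accel: float) -> float:
    """Object starting at rest under constant acceleration: p = 0.5 * a * t^2."""
    return 0.5 * accel * t * t

def interpolated_midpoint(p0: float, p1: float) -> float:
    """What a two-frame interpolator produces: the straight-line midpoint."""
    return (p0 + p1) / 2.0

a = 8.0                        # arbitrary acceleration, units per frame^2
p0 = true_position(0.0, a)     # position in frame N      -> 0.0
p1 = true_position(1.0, a)     # position in frame N+1    -> 4.0

mid_fake = interpolated_midpoint(p0, p1)   # generated frame puts it at 2.0
mid_true = true_position(0.5, a)           # it should really be at 1.0

# The interpolated frame places the object ahead of where it actually is,
# which reads as subtly "fast-forwarded" motion during acceleration.
error = mid_fake - mid_true    # 1.0
```

The interpolator isn't wrong about the endpoints, only about everything in between, and the discrepancy grows with the acceleration between the two source frames.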

I had to play Halo CE multiplayer at Fan Expo on a set with this thing enabled, and it made the game pretty much unplayable for me.
 
to be honest I have little to say on the topic, but on the question of the usage of the word "lag":

network latency = lag
input latency = lag
low framerate != lag

frames drop, they don't lag. pet peeve thy name is internets.
 
I was loving Assassin's Creed III with my Auto Motion Plus on. Then I turned it off. Then I completed the first sequence. It was all downhill from there.

But for those first 3 hours, I was absolutely loving that game. Those cutscenes and that gameplay... so much better.
 
Yup, with the right game it's awesome. I've used it with Skyrim for 300 hours or so; it almost makes it feel like a new game.
I wouldn't use it for fast-paced games though. Wii pointer games are a big no-no too.
 
What is this Soap Opera effect everyone is talking about? Can anyone give an example?

Things that are shot on film are typically shot at 24fps. Soap operas are normally shot on video, at up to 60fps. People who watch a lot of movies get used to the lower frame rate and it's a bit of a shock when they watch something shot at a higher rate, to the point that they perceive it as "fake".
 
It's hit or miss on my TV for games, so I just leave it off.
This... sometimes I do it just because it's pretty. My TV only adds about 50-80ms; in some games it's a killer, in others it's worth at least a test run. Might try it on the asscreed 3 Wii U version later just to get a better sense of the added lag.
 
Interpolation adds 105ms of extra lag on my TV. (from 40ms without to 145ms with) Unacceptable for games.

Things that are shot on film are typically shot at 24fps. Soap operas are normally shot on video, at up to 60fps. People who watch a lot of movies get used to the lower frame rate and it's a bit of a shock when they watch something shot at a higher rate, to the point that they perceive it as "fake".

Shot at 60fps is fine.
The "fake" part comes because motion interpolation is shit at objects rotating on the x and y axes. It also tries to smooth out shake (camera and object). Shaking is jerky by definition, and when you smooth it, it just looks fucking wrong.
 
I personally think that it depends on a lot of variables, like different TVs, different implementations of the tech, and the game being played. If you get a TV that does it well, and play a game that already doesn't have a lot of input lag, then it will be pretty sweet. I remember playing Halo 3 on a motion flow TV or whatever it was called and it was awesome - really did seem like 60fps even with the controller in hand. That said, part of the appeal of TRUE 60fps is reduced input lag, so it's a bit pointless to play something dire like Heavenly Sword with it. But for me, a greater appeal is how smooth it makes everything look, so the tech does pique my interest in a novel way.
 
Shot at 60fps is fine.
The "fake" part comes because motion interpolation is shit at objects rotating on the x and y axes. It also tries to smooth out shake (camera and object). Shaking is jerky by definition, and when you smooth it, it just looks fucking wrong.

I was speaking strictly in context of the question about why it was called the "soap opera effect". Some people are so used to low framerate video that they perceive 60fps, even with no interpolation, as fake.
 
Anyone else read the title as "I just accidentally my TV's motion smoother..." ?

It's getting too late here.
 
to be honest I have little to say on the topic, but on the question of the usage of the word "lag":

network latency = lag
input latency = lag
low framerate != lag

frames drop, they don't lag. pet peeve thy name is internets.
Here, though, is a very special context: the OP is claiming that he feels the game has improved with the Active Motion tech, which on the technical side interpolates between the current frame and the next to generate the in-between frame.

He may feel the game looks smoother, even though the tech's implementation means he's adding input latency with the motion blending on.
 
I was speaking strictly in context of the question about why it was called the "soap opera effect". Some people are so used to low framerate video that they perceive 60fps, even with no interpolation, as fake.
I think there's something about how motion blur is done in film that makes it even worse than a soap opera. It's like everything in motion is a cardboard cut-out.
 
A few years ago I bought a 60Hz Sharp Aquos specifically to avoid this. I like the idea of motion interpolation making my 30fps games look like 60, but what I've seen of it is never consistent: when I played Uncharted 2 at a friend's house, the image on the TV jumped between the perceived "60fps" and 30 all the time; it never stayed consistent. Does this still happen with the latest TV models?
 
I have a Samsung C8000 LED TV from 2010, and with the blur and judder reduction turned up to 10, some games really do look amazing with this feature.

Halo 3 and Halo 4 and other 30fps games look awesome with it. Never had a problem on multiplayer matches.
 
The concept is very cool, but I've yet to see a fully convincing implementation. The problem I've experienced is simply that the TVs are unable to cope with all situations. If animation is too fast or too slow the effect is ruined. It's possible to end up with some parts of the screen moving at 60 fps while others appear to run slower. Racing games do seem to benefit the most as the motion is just the right speed and somewhat predictable. When it starts to break, however, I find it to look dreadful.

Also, it does *NOT* cope well with framerate dips. Forza Horizon works OK simply because it holds a very steady framerate but if the game drops below 30 fps or tears the effect is compromised. It simply isn't stable enough for me in most games, from what I've tested.

There are many different implementations of this feature as well, each with its own pros and cons, but I've yet to find any that can handle all the challenges.
 
I hate the setting; it makes games feel very unnatural. On my TV there's also very noticeable input lag, so playing FPSes, or really any game that isn't a point 'n' click, is out of the question.

Also, the image quality suffers when things move too fast. It's very noticeable around the edges of (semi-)stationary objects in front of moving backgrounds (like your main character, or the mini-map).

It can be great though: games like The Walking Dead don't require fast button input anyway, and the blur reduction makes moving images seem very clear and not, well, blurry.
 
Holy shit. A second or two?

Well if the game is already pretty laggy you would only be making it worse by turning on interpolation. Plus your display's normal latency. Pretty sure it's along the lines of:

Controller signal to Console time + game's input processing time + time to render frame which reflects input change + time to display that frame on your TV.

When you turn on interpolation, you'll be adding to that display latency. Each game varies with input processing and rendering time as well.
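That chain is just a sum of stage latencies, which a quick sketch makes concrete. The stage names and millisecond values below are hypothetical placeholders; only the 105ms interpolation figure is taken from a measurement quoted earlier in the thread:

```python
# Hypothetical press-to-photon latency chain, following the breakdown above.
# Every value here is an illustrative guess, not a measurement.

def press_to_photon_ms(stages: dict[str, float]) -> float:
    """Total latency is simply the sum of every stage in the chain."""
    return sum(stages.values())

chain = {
    "controller_to_console": 8.0,   # wireless pad polling/transmit
    "game_input_processing": 33.3,  # one 30fps frame of game logic
    "render_frame": 33.3,           # one more frame to draw the result
    "tv_display": 40.0,             # TV's own processing in game mode
}
base = press_to_photon_ms(chain)            # ~114.6ms

# Interpolation just appends another stage to the chain:
chain["tv_interpolation"] = 105.0
with_smoothing = press_to_photon_ms(chain)  # ~219.6ms
```

The point of the additive model is that the TV's interpolation delay stacks on top of everything the game and console already contribute; it can't be hidden by any of the earlier stages.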
 
But the Internet told me you're not supposed to like that feature.
I've never used it for games, but I can't stand it for regular TV and movies. Some people I know have motion smoothing on and have subsequently gotten used to it, but I still think it's blasphemy.
 
... Holy shit. Forza Horizon (which is locked at 30fps) feels like 60fps, and there's no real increase in input lag noticeable to me.

I have a Toshiba TV with Active Vision 400 mode. (Pseudo 400Hz)

Anyone tried their motion smoother with games? (Motion Plus --> Samsung; Motionflow --> Sony)

I'm going to try Far Cry 3 now, but I guess with a dynamic fps the stuttering will be increased.

Same, tried it once with AC:B on PS3. The difference is like night and day.
 
I've never used it for games, but I can't stand it for regular TV and movies. Some people I know have motion smoothing on and have subsequently gotten used to it, but I still think it's blasphemy.
My new Samsung HDTV came with motion smoothing enabled by default. I think it's a neat feature but I also vastly prefer having Motion Plus set to "clear."

Everything just looks unnatural with motion smoothing. I haven't tried it with games though. My TV also has an option called "game mode." Anyone have any experiences with that?
 
My new Samsung HDTV came with motion smoothing enabled by default. I think it's a neat feature but I also vastly prefer having Motion Plus set to "clear."

Everything just looks unnatural with motion smoothing. I haven't tried it with games though. My TV also has an option called "game mode." Anyone have any experiences with that?

Game Mode disables most, if not all, postprocessing that causes lag. On recent Samsung TVs, Game Mode is a per-input setting that has its own contrast/brightness/etc settings, stored separately from the ones used with Game Mode off. This will also disable Motion Plus (by far the worst offender in introducing input lag) entirely. On last year's 3D models, Game Mode is also the only way to disable Motion Plus when viewing 3D content.

Motion Plus's clear mode has its uses if you're watching movies shot at 24fps, but otherwise it's a completely unneeded feature.
 
The feel of 60 frames per second.
I've seen this phrase a few times in the last few days and it's starting to weird me out. I don't read gaming much; I feel like I'm new here all of a sudden.
 
My TV set gives me some major input lag when I put it in 120Hz mode. Sometimes it's hardly noticeable, but in games like Rock Band it makes them unplayable.
 
I've seen this phrase a few times in the last few days and it's starting to weird me out. I don't read gaming much; I feel like I'm new here all of a sudden.
Ninja Theory reference, they're making DmC 30fps but promise "the feel of 60fps" (which is preposterous marketing talk, of course).
 