Hollywood Directors team up against the scourge of TV Motion Smoothing

Should TV manufacturers make a filter that adds chromatic aberration to your TV?

Hey man! It works great for video gaming on your TV. I get that it shouldn't be on by default. But it got me through lots of PC gaming goodness, making it seem like some games ran at a much smoother frame rate than they actually did.
 
The fact that it's hidden deep in menus, called something different on every TV, AND on by default is the biggest bullshit. Yeah, I'm totally gonna drop a few thousand on a TV because I want to make everything look like it was filmed in some British dude's basement in 1973.
This sums it up. Always the first feature I disable.
 
Every time I went to a friend's house to watch Game of Thrones I hated it; his Vizio TV had this shit cranked to the max, and it ruins everything.
 
My mom got a new 4K 65" X930D Sony TV at the beginning of the year, and with motion smoothing entirely turned off, jerkiness was distractingly bad. Fortunately, it has custom adjustments, so I enabled it with the minimum amount of smoothing, and that addresses 90% of the motion jerkiness without making motion look soap-opera fake and overly smooth. So I think there have been improvements in the tech that don't have to make everything look like butt. My ~4 year old Sony W900a looks terrible with it turned on, so I keep it off.

I hate when I walk into a bar or a restaurant and they've got their TVs with motion interpolation set to maximum soap opera. I just want to take their remote and fix it.
 
One of the problems with LCD displays is that motion effectively reduces the resolution to 300p. So yep, 1080p in stills or slow panning. But as soon as the action gets heated, the details fly out the window.
That's caused by displays being flicker-free, not because they are LCDs. OLEDs have exactly the same problem.
It's also much worse than that. Those motion tests they use - which are a test of slow panning motion, not fast motion - run at 60Hz.
On a flicker-free display, motion resolution is linked directly to framerate. So it's not dropping to "300 lines of motion resolution" with 24 FPS film - it's dropping to 120 lines.
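The scaling described above can be sketched in a few lines. This is just the linear relationship the comment implies (the 300-line figure at 60 fps is the test-pattern result it cites, not a measured constant of mine):

```python
# Assumption from the post above: on a sample-and-hold display, motion
# resolution scales linearly with framerate, and standard motion tests
# report roughly 300 lines at 60 fps.
LINES_AT_60FPS = 300

def motion_resolution(fps: float) -> float:
    """Approximate lines of motion resolution on a flicker-free display."""
    return LINES_AT_60FPS * fps / 60

print(motion_resolution(60))  # 300.0 -- the figure the tests report
print(motion_resolution(24))  # 120.0 -- 24 fps film, as stated above
```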

I'm all for native HFR. That's not what this is. This is like colorization of black and white material. It's not good looking.
Nope. Film looks like that natively. The problem is that pretty much no-one has been watching film natively at 24Hz for decades.
Interpolation is closer to an accurate representation of the source material than repeating frames.

Interpolation is gross. Sample-and-hold displays are insufficient for displaying full motion.
Y'all should've bought plasmas while you could. Too bad the early stigma stuck, FUD spread like wildfire, and the average consumer continually buys into half-baked gimmickry.
I'll take motion blur over PWM driving. Plasmas give me migraines due to the problems that they have in motion, like the image breaking up into separate colors. Black and white films in particular are unwatchable.
Warning: lots of flicker. This slow-motion video shows how they actually draw the image.
Note: the refresh rates listed there are wrong. 14 sub-fields per frame is 840Hz (not 1400Hz) and 28 sub-fields per frame is 1680Hz (not 2800Hz).
For good motion handling, refresh rate must be equal to the framerate.
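The corrected figures above are simple arithmetic; a quick sketch, assuming the plasma redraws every sub-field once per 60 fps frame:

```python
# A plasma drawing N sub-fields per frame at 60 frames per second
# flickers at N * 60 Hz (assumption: one pass per sub-field per frame).
def subfield_rate_hz(subfields_per_frame: int, fps: int = 60) -> int:
    """Effective flicker rate of a plasma's sub-field drive."""
    return subfields_per_frame * fps

print(subfield_rate_hz(14))  # 840, not 1400
print(subfield_rate_hz(28))  # 1680, not 2800
```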
 
You know, I've always hated the look but just learned the name thanks to this thread. Fortunately my TV doesn't have it, but next time I visit a few certain family members, I'll do them a quick favor.
 
The high frame rate would be nice for some things... but it is HORRIBLE for everything due to the damn artifacts. Sports probably doesn't suffer if people don't mind it... unless they're used to seeing players look like they came out of Ringu...

Kill. It. With. Fire.

At least HFR will never be a thing in cinemas again...?
 
This was the first thing I turned off when I got a TV a few years ago. It has been an issue for a while and I always roll my eyes at the sight of someone with it on.
 
I'll take motion blur over PWM driving. Plasmas give me migraines due to the problems that they have in motion, like the image breaking up into separate colors. Black and white films in particular are unwatchable.
Sorry that you fall into the minority and seem to have major issues with flickering. As someone who also suffers from migraines, I totally get it. I left a job because they changed to different lighting that was a major trigger, and I've viewed some plasmas that have had a similar effect on me. However, not all PDPs were created equal, and I've owned late-model Pioneers/Panasonics for years without having issues.

Blur, however, continues to be a persistent issue with every LCD/OLED. And I'm a home theater installer so you name it, and I'm eyes-on with it on a regular basis.
 
It's gotten better over the years, but any of the "high" settings just look awful in the real world (not on TV-specific engineered demos). I don't know how the QA departments look at a TV with motion smoothing set to high and think to themselves "the motion on this looks amazing".
 
Motion interpolation is one thing, but I'm not at all in agreement with the High Frame Rate haters.

I, for one, can't wait until directors get on board with 120 Hz film making.
 
The fact that it's called something different by every TV manufacturer is bullshit. Y'all should do what I do and just turn off every post-processing feature on a TV.
 
The high frame rate would be nice for some things... but it is HORRIBLE for everything due to the damn artifacts. Sports probably doesn't suffer if people don't mind it... unless they're used to seeing players look like they came out of Ringu...

Kill. It. With. Fire.

At least HFR will never be a thing in cinemas again...?
The Avatar sequels are being shot at 48fps HFR, last we heard.
 
Is there a reason games don't use this to achieve at least the look of 60 FPS?
There's been discussion of games implementing this internally to achieve that effect with minimal input lag, but nobody's actually done it yet. It has some things in common with temporal reconstruction techniques used for AA, just with the added element of interpolating new frames instead of just smoothing out temporal artifacts in existing ones.
 
I lost this argument for years at friends houses and just gave up.

"It looks more real"

Is there a reason games don't use this to achieve at least the look of 60 FPS?
Input lag is the main reason which is really the opposite of what you want when trying to achieve 60fps responsiveness. Ends up working against your goal.
 
Sorry that you fall into the minority and seem to have major issues with flickering. As someone who also suffers from migraines, I totally get it. I left a job because they changed to different lighting that was a major trigger, and I've viewed some plasmas that have had a similar effect on me. However, not all PDPs were created equal, and I've owned late-model Pioneers/Panasonics for years without having issues.
Yeah, it sucks. I have issues with a lot of LED lighting too - especially cheap LED lighting, which tends to use PWM dimming.
It's not an issue with any specific model of plasma though - it's how they fundamentally draw an image that is the problem.
Even if the phosphors all had identical response times, which would solve the color breakup problem, they're still drawing the image using sub-frames instead of refreshing once per frame.

Blur, however, continues to be a persistent issue with every LCD/OLED. And I'm a home theater installer so you name it, and I'm eyes-on with it on a regular basis.
Several LCDs and even some OLEDs now have backlight scanning or BFI options, which greatly reduce motion blur if the strobing runs at the same rate as the framerate.
The main problem arises when the refresh rate is higher than the source framerate, since you get clear double-images in place of the motion blur.
With film only being 24 FPS, and manufacturers refusing to strobe at a rate lower than 60Hz, the only solution is interpolation, or some combination of interpolation plus strobing.

Aren't most movies on TV using 3:2 pulldown or something similar?
I'd rather have fake fluidity than jerky panning shots.
3:2 pulldown means that they extract the original 24 FPS frames from a 60Hz source.
Five 1080i60 fields give you two 1080p24 frames with 3:2 pulldown - or even 480i60 to 480p24 with DVDs in certain players.
This eliminates the judder caused by an uneven 3:2 cadence from fitting 24 frames into 60 refreshes, but does nothing for the jerky appearance of low framerate motion on a flicker-free display.
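The cadence described above is easy to demonstrate. A small sketch of the forward direction (film frames to fields); the frame labels are just placeholders:

```python
# 3:2 pulldown: 24 film frames fill 60 interlaced fields by holding each
# frame alternately for 3 fields, then 2. Reversing this mapping is how
# players recover the original 24p frames from a 60 Hz source.
def pulldown_fields(frames):
    """Map film frames to a 60 Hz field sequence using a 3:2 cadence."""
    fields = []
    for i, frame in enumerate(frames):
        fields.extend([frame] * (3 if i % 2 == 0 else 2))
    return fields

fields = pulldown_fields(["A", "B", "C", "D"])
print(fields)       # ['A','A','A','B','B','C','C','C','D','D']
print(len(fields))  # 10 fields per 4 frames, so 24 frames -> 60 fields
```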

The Avatar sequels are being shot at 48fps HFR, last we heard.
Filmmakers really need to drop the idea that there is something magical about the number 24.
If it were shot at 60 FPS, it could be displayed in HFR on virtually every television ever produced (at least in NTSC regions).
Almost nothing supports 48Hz inputs.

If they are going to pick a framerate that nothing currently supports, they should have gone with something like 120 FPS.
Push for something even higher than that and keep theaters one step ahead of consumer displays, even.
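The compatibility argument above comes down to divisibility: a source framerate displays without cadence judder only when the panel's refresh rate is an integer multiple of it. A sketch, assuming typical NTSC-region panel rates of 60 and 120 Hz:

```python
# Which common panel refresh rates are integer multiples of a given
# source framerate? (60/120 Hz are assumed typical NTSC-region panels.)
DISPLAY_HZ = [60, 120]

def clean_multiples(source_fps: int):
    """Refresh rates that show this framerate with an even cadence."""
    return [hz for hz in DISPLAY_HZ if hz % source_fps == 0]

for fps in (24, 48, 60, 120):
    print(fps, "->", clean_multiples(fps) or "no clean match")
# 24 -> [120]          (60 Hz needs 3:2 pulldown)
# 48 -> no clean match (hence "almost nothing supports 48Hz")
# 60 -> [60, 120]
# 120 -> [120]
```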

I lost this argument for years at friends houses and just gave up.
"It looks more real"
They're not wrong.
24 FPS looks very unnatural the way that 99% of displays out there show it.
It's just that cinephiles somehow like that awful stuttering motion and the screen becoming a blur if anything moves.
 

MattKeil

BIGTIME TV MOGUL #2
This is the unfortunate side effect of the soap operas of the 80s and 90s - anything that looks high frame rate or smooth has embedded the association of "cheap soap opera" into the general public, and this turned a lot of people off the technical aspects of The Hobbit.

People have this ingrained idea that it's only a "film" if it's 24fps. It "feels filmic" or "has that filmic look". And rightfully so - the 24fps standard has been around for decades and has been used for almost every movie they've watched.

But 24fps is only a number that was chosen because it was the tradeoff between motion fidelity and cost. It was good enough and cheap enough to be widely adopted. Unfortunately it's also terribly limiting in some cases, but attempting to go against the grain is often met with revulsion from film purists.

We need more big budget, high frame rate content, especially for action and sci-fi films where temporal fidelity is important. This will help change the perception and make it more acceptable to have high frame rate content for films that would benefit from it. Can you imagine Fast & Furious or Star Trek shot natively at 60fps? It would be a feast for the eyes.
It would look like garbage and it can fuck off. HFR does not do anything but make film look like low budget crap. 60fps is for videogames, not movies.

You HFR people will never win this fight, thankfully. Nobody can afford to triple their CG budgets.
 
My parents bought a new massive 4k tv and when I went to visit they were watching a stretched SD channel with motion smoothing on. I wanted to gouge my eyes out with a rusty spoon.
 
If you're dumb enough to bring your newly purchased TV home and leave this "feature" on while watching movies: maybe you don't deserve a new TV.

What I'm getting at is, it's a feature that should not go away. It should be there as an option. Because like I said, I think it works great for (most) gaming.
 
I've always wondered what this feature was but could never put my finger on it.
The only place I've seen this feature is at supermarkets and at some of my family's friends' houses since it never came by default on any of the TVs I've owned
 
So what is motion smoothing? What does it look like?
More like real life instead of being a slide show. Unfortunately, we've been brainwashed to think that stuttery, stilted framerates are a mark of high quality photography.

To be fair, motion smoothing is artificially trying to achieve higher frame rates after the fact by frame doubling and adding interpolated frames (akin to anti-aliasing in the temporal direction).
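To make "adding interpolated frames" concrete: real TVs use motion-vector search, but the simplest stand-in is a per-pixel blend halfway between two real frames. A toy sketch (frames here are just small grids of brightness values, an assumption for illustration):

```python
# Naive frame interpolation: the in-between frame is a pixel-wise linear
# blend of two source frames. Actual TV interpolators track motion
# vectors instead, which is where the artifacts come from.
def blend_frames(a, b, t=0.5):
    """Interpolated frame at fraction t between frames a and b."""
    return [[(1 - t) * pa + t * pb for pa, pb in zip(ra, rb)]
            for ra, rb in zip(a, b)]

frame1 = [[0, 100], [50, 200]]
frame2 = [[100, 100], [150, 0]]
print(blend_frames(frame1, frame2))  # [[50.0, 100.0], [100.0, 100.0]]
```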
 
I'll notice this on other people's TVs when I'm at their house and ask them if they want me to turn it off and they'll ask me what I'm talking about.

Might be too late.

See it on all the 4K TVs whenever I go to Best Buy.

Shame.