
120hz Movies: How can people watch this shit?


Danoss

Member
Glad I clicked on this thread.

I was at a family member's house and the movie they were watching looked really weird. It was like the movie had been sped up, but the audio was fine. It was doing my head in trying to figure out what it was; now I know.
 

StoOgE

First tragedy, then farce.
optimiss said:
You could probably make that argument for some people, but I like to try to embrace new technologies so I gave it a chance. Turns out it's only weird because it is new. Once it was no longer new, it ceased to be weird, and I started to enjoy the sense of realism that it creates.

It isn't weird because it is new. It is weird because it is an algorithm that creates images that aren't in the original image. Directors spend months getting their movie just the way they want it in post production, and then these TVs shit all over that by creating new frames of animation the director had nothing to do with.

I am all for the push to 60 or 72 FPS that James Cameron is making. More FPS = clearer, cleaner motion and less judder in fast-moving scenes. I get that. However, using a computer to fake it isn't the way to do it. The film wasn't shot with that kind of smooth motion in mind. Good directors work within the medium, and they took into account that the film would be shot at 24FPS when they were framing the shots, action and camera sweeps.

Let Cameron come out with Avatar 2 in 3D and 72 FPS and everyone will shit themselves about how crazy good the 3D looks and how great it is in motion. Until then, stop trying to force movies to be something the director never intended.

It's like going up to the Mona Lisa and touching it up with neon colors because you think it looks better with less muted tones.

It also doesn't help that the vast majority of the frame interpolation techniques look like shit, especially when dealing with CG integrated into a live action scene. Makes the CG look really obvious 99% of the time.
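
To make it concrete, the crudest version of what these interpolation chips do is something like the sketch below. This is just an illustration of the idea - real sets use motion-vector search rather than a dumb cross-fade, and every manufacturer's algorithm is different - but the point stands: the in-between frame is invented, it was never shot or graded by anyone.

Code:
import numpy as np

def midframe(frame_a, frame_b):
    # Invent a frame "halfway" between two real frames by averaging them.
    # Real TVs estimate motion per block and shift pixels along it instead.
    return ((frame_a.astype(np.float32) + frame_b.astype(np.float32)) / 2).astype(frame_a.dtype)

def smooth_24_to_48(frames):
    # frames: list of H x W x 3 arrays straight off the 24fps source
    out = []
    for a, b in zip(frames, frames[1:]):
        out.append(a)               # real frame the director approved
        out.append(midframe(a, b))  # frame nobody ever shot
    out.append(frames[-1])
    return out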
 

Sirius

Member
Makes me all warm and fuzzy inside knowing that our generation has become accustomed to the 48Hz / 72Hz shown in cinema theatres.

Cannot have it any other way for movies - but for sports the fluidity of frame transition is essential.
 

Ether_Snake

安安安安安安安安安安安安安安安
Good for anything that is "live" or "live-like". Bad for movies.
 

DonMigs85

Member
Sirius said:
Makes me all warm and fuzzy inside knowing that our generation has become accustomed to the 48Hz / 72Hz shown in cinema theatres.

Cannot have it any other way for movies - but for sports the fluidity of frame transition is essential.
Of course the best case scenario is to film sports in 60FPS already and use a display with a low response time for minimal motion/panning blur.
 

optimiss

Junior Member
StoOgE said:
It isn't weird because it is new. It is weird because it is an algorithm that creates images that aren't in the original image. Directors spend months getting their movie just the way they want it in post production, and then these TVs shit all over that by creating new frames of animation the director had nothing to do with.

I am all for the push to 60 or 72 FPS that James Cameron is making. More FPS = clearer, cleaner motion and less judder in fast-moving scenes. I get that. However, using a computer to fake it isn't the way to do it. The film wasn't shot with that kind of smooth motion in mind. Good directors work within the medium, and they took into account that the film would be shot at 24FPS when they were framing the shots, action and camera sweeps.

Let Cameron come out with Avatar 2 in 3D and 72 FPS and everyone will shit themselves about how crazy good the 3D looks and how great it is in motion. Until then, stop trying to force movies to be something the director never intended.

It's like going up to the Mona Lisa and touching it up with neon colors because you think it looks better with less muted tones.

It also doesn't help that the vast majority of the frame interpolation techniques look like shit, especially when dealing with CG integrated into a live action scene. Makes the CG look really obvious 99% of the time.

You don't have to like it, but I do.
 

reKon

Banned
lol at instant hate within the first like 10 posts. I agree though, 120 hz does look weird. Like people's movements are sped up and more smooth.
 

Raistlin

Post Count: 9999
MattKeil said:
Absurd. Film is 24fps, and should be viewed as such.

Curiously, many directors and cinematographers are advocating moving up to at least 48hz, if not higher.





DonMigs85 said:
It has no place in games either. It wasn't intended by the developers, and it can increase input lag as well.

It's not that it can ... it does, by its very definition.
 

StoOgE

First tragedy, then farce.
Sirius said:
Makes me all warm and fuzzy inside knowing that our generation has become accustomed to the 48Hz / 72Hz shown in cinema theatres.

Cannot have it any other way for movies - but for sports the fluidity of frame transition is essential.

James Cameron is pushing *hard* for 60 or 72fps movies, which is plenty for fluid animation. He wanted to do it for Avatar, but the studio told him the 3d push was a big enough technology push all on its own.

I'm almost positive he is going to make a strong push for Avatar 2 to use a higher frame count (though he may stop at 48fps). Once he does it and people marvel at the smooth motion and how much easier it is to see the 3d in quick scenes, I'm sure the other studios will fall in line.

Of course, SFX costs will go up and 3d animated films will become much more expensive to animate, but I assume render farms could use some higher-end frame interpolation for some of the 3d effects (hell, they are probably doing this already).
 

Raistlin

Post Count: 9999
MrPliskin said:
There is a certain 'fantasy' nature that comes with shooting at 24fps. It gives a silky smooth vision with blur and conveys motion considerably better than high frame rates.

lolwut?

The judder on pans is nauseating :lol
 

DonMigs85

Member
If they go with 48FPS or higher though, won't the film reels for traditional cinemas double in size too? Best to wait till digital cinemas are the norm IMO.
 
MisterHero said:
SciFi Channel used to air Twilight Zone episodes that way and I thought it was neat because the people looked like they were live and putting on a production

In a strange way it gave me a better sense about the people in that time
Those are actually the Twilight Zone episodes that were shot on video. The majority of the show was shot on film, but some episodes were shot on video to save money. It's pretty jarring when you watch two or three "film" episodes then get the random "video" one.
 

Raistlin

Post Count: 9999
Sirius said:
Makes me all warm and fuzzy inside knowing that our generation has become accustomed to the 48Hz / 72Hz shown in cinema theatres.

What theaters are you going to? It's very rare to film content at that frequency.
 

Slavik81

Member
Raistlin said:
Curiously, many directors and cinematographers are advocating moving up to at least 48hz, if not higher.
Is there any reason why they stick with multiples of 24? It would be nice to get everyone on the same page.

Variable-frequency displays could also solve that problem... Is there a technical reason why all displays I've seen recently have a fixed refresh rate?
 

StoOgE

First tragedy, then farce.
Raistlin said:
lolwut?

The judder on pans is nauseating :lol

A well filmed movie won't really have the problem.

Telecine judder is usually introduced by the display device rather than the source. If your TV fails at 2:3 pulldown (and even some TVs with 3:3, 5:5, or 10:10 fail to resolve the cadence properly), it will introduce crazy judder.

I remember the Panasonics that came out against the Kuro 9Gs two years ago really failed in 24p mode despite having 3:3 pulldown. A lot of TVs have shit processors despite a proper refresh rate for 24p content.

I'm not saying animation won't be considerably smoother, but judder on pans is typically a processing problem and not with the film.
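
For anyone who hasn't seen the cadence written out, 2:3 pulldown just holds alternate film frames for 2 and then 3 display refreshes to stretch 24fps onto a 60Hz screen; the uneven repeats are exactly where the judder comes from. Rough sketch of the cadence only, not any particular TV's processing:

Code:
def pulldown_2_3(film_frames):
    # Map 24fps film frames onto 60Hz refreshes: 2 repeats, then 3, alternating.
    cadence = [2, 3]
    refreshes = []
    for i, frame in enumerate(film_frames):
        refreshes.extend([frame] * cadence[i % 2])
    return refreshes

print(pulldown_2_3(list("ABCD")))
# ['A', 'A', 'B', 'B', 'B', 'C', 'C', 'D', 'D', 'D']
# 24 frames x 2.5 average repeats = 60 refreshes, but motion steps unevenly.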
 

Raistlin

Post Count: 9999
DonMigs85 said:
If they go with 48FPS or higher though, won't the film reels for traditional cinemas double in size too? Best to wait till digital cinemas are the norm IMO.

Actually film is already displayed at 48fps. They display each frame twice in order to reduce flicker.
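
The arithmetic being described, for anyone keeping score (a 3-blade shutter does the same trick at 72 flashes per second):

Code:
frames_per_second = 24   # unique images on the film
flashes_per_frame = 2    # each frame flashed twice (3 with a 3-blade shutter)
print(frames_per_second * flashes_per_frame)  # 48 flashes per second, still only 24 unique frames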
 

StoOgE

First tragedy, then farce.
DonMigs85 said:
If they go with 48FPS or higher though, won't the film reels for traditional cinemas double in size too? Best to wait till digital cinemas are the norm IMO.

I think that is part of the issue. Digital cinemas could probably be converted to handle it, even if they required an upgrade to do so. Reel-to-reel would be completely fucked... double-speed and double-sized reels. It would cost considerably more to run, and a lot of the motors would have a fuck of a time spinning that much film that fast.

Luckily, I think the 3d adoption of cinemas is going to take off and those should all be digital at this point.

Raistlin said:
Actually film is already displayed at 48fps. They display each frame twice in order to reduce flicker.

Right, but with 3d films you have to essentially double that. The 48fps works because they can put the "B" reel in the uneven frames instead of a repeated frame. Going to 48fps would mean needing double that for 3d.

Though, this is probably academic because I think almost all (if not 100%) of the new 3d installs are going to be digital. I left the theater business before the 3d craze kicked off, so I may be talking out of my ass a bit here.

Interestingly, that would essentially fuck the current 3d TVs that only refresh at 120hz since that is no longer a multiple of 48x2 :lol
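
Quick sanity check on that multiple-of-48x2 point: a panel can only show a source cleanly if its refresh rate is a whole multiple of the source rate. This is just the arithmetic, not a claim about any specific set:

Code:
def clean_cadence(refresh_hz, source_fps):
    # True if every source frame can be held for the same whole number of refreshes.
    return refresh_hz % source_fps == 0

# 96 = 48fps per eye x 2 eyes for frame-sequential 3D
for src in (24, 48, 60, 96):
    fits = [hz for hz in (120, 144, 240) if clean_cadence(hz, src)]
    print(src, "fps ->", fits if fits else "needs pulldown")

# 24 fps -> [120, 144, 240]
# 48 fps -> [144, 240]
# 60 fps -> [120, 240]
# 96 fps -> needs pulldown (you'd want 192Hz or 288Hz for that)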
 
I prefer the motion-enhanced stuff. I have a Sharp Aquos and just checked the options, and apparently I've had it on since I got the TV about a year ago. I guess you get used to it because it looks great to me. Doesn't look strange at all now that I'm used to it, and it must not be that visually jarring because I never noticed it before. Then again, it might just be that Sharp's tech is less noticeable or something.
 

Raistlin

Post Count: 9999
StoOgE said:
A well filmed movie won't really have the problem.

If by well-filmed you mean they run a math algorithm to generate the bell curve of what pan speeds to avoid ... okay :p

Beyond that, fast motion is certainly problematic as well - the image is reduced to a blurry mess. The only way to avoid that is to simply not have fast motion at all, which is obviously unrealistic.

Telecine judder is usually introduced by the display device rather than the source. If your TV fails at 2:3 pulldown (and even some TVs with 3:3, 5:5, or 10:10 fail to resolve the cadence properly), it will introduce crazy judder.

Sorry, I was talking about regular judder (sometimes referred to as film judder iirc), not telecine judder.

I remember the Panasonics that came out against the Kuro 9Gs two years ago really failed in 24p mode despite having 3:3 pulldown. A lot of TVs have shit processors despite a proper refresh rate for 24p content.

I'm not saying animation won't be considerably smoother, but judder on pans is typically a processing problem and not with the film.

I don't have a TV with problematic pulldown though, and judder is noticeable ... just as it is in movie theaters.
 

Mashing

Member
I was wondering why my dad's new TV looked like a soap opera. Thanks to this thread I went in and turned this shit off and now it looks normal again.
 

StoOgE

First tragedy, then farce.
Raistlin said:
Beyond that, fast motion is certainly problematic as well - the image is reduced to a blurry mess. The only way to avoid that is to simply not have fast motion at all, which is obviously unrealistic.

I've never really noticed an issue with slow pans having judder. Maybe it's slight enough that I just don't pick up on it. Telecine judder bugs me to no end though. I don't have it with my Kuro, but most of my friends' TVs have it like crazy and it makes me want to throw things. One of my buddies had a TV with a "film" mode that he wasn't using, and he had disabled 24p content on his Blu-ray player. I fixed them both and was immediately happy; he still can't tell the difference but won't change it lest I yell at him next time I'm over.

Quick action looking like shit is a real problem... and it doesn't help that quick cuts are the new thing... your brain already has a lot of problems processing the crazy blurry mess happening. Zooming in and cutting to different cameras every 5 seconds doesn't help.

Though it isn't as bad on a smaller screen at home.

But yeah, bring on 48/72 FPS movies. Just shoot them natively that way. And I do prefer they keep them at a multiple of 24FPS to maintain a "film" look.
 

Raistlin

Post Count: 9999
StoOgE said:
Right, but with 3d films you have to essentially double that. The 48fps works because they can put the "B" reel in the uneven frames instead of a repeated frame. Going to 48fps would mean needing double that for 3d.

Though, this is probably academic because I think almost all (if not 100%) of the new 3d installs are going to be digital. I left the theater business before the 3d craze kicked off, so I may be talking out of my ass a bit here.

Interestingly, that would essentially fuck the current 3d TVs that only refresh at 120hz since that is no longer a multiple of 48x2 :lol

Actually, basically everything would be fucked - not just TV's :p While BD can output 1080p60 in theory, you'd be putting a lot of demands on bit-rate, not to mention capacity would be fucked.

Expect VioletRay Disc (lol VD) within a decade :D




I should note regarding the 48x2 issue, if search worked ... you'd see I've been advocating that all TV's should support 120 and 144Hz. 72x2 BITCHES!!!!! ;-)
 

Slavik81

Member
StoOgE said:
But yeah, bring on 48/72 FPS movies. Just shoot them natively that way. And I do prefer they keep them at a multiple of 24FPS to maintain a "film" look.
Screw it and skip to 120Hz native for all content. Let's go, technology!
 

DonMigs85

Member
Slavik81 said:
Screw it and skip to 120Hz native for all content. Let's go, technology!
that would require a ton of memory to store, plus anything above 60-72Hz looks perfectly smooth to our eyes.
 

StoOgE

First tragedy, then farce.
Raistlin said:
Actually, basically everything would be fucked - not just TV's :p While BD can output 1080p60 in theory, you'd be putting a lot of demands on bit-rate, not to mention capacity would be fucked.

Well, BRD can be increased in capacity by spec, but it would fuck all of the current players (though I'm sure Sony would shoehorn the PS3 in there somehow).

It really would have been nice if 5 years ago everyone sat down and said "alright, this is where we want to be in 10 years time" so we could get all of this sorted out in the first place :lol
 

Raistlin

Post Count: 9999
StoOgE said:
I've never really noticed an issue with slow pans having judder. Maybe it's slight enough that I just don't pick up on it. Telecine judder bugs me to no end though. I don't have it with my Kuro, but most of my friends' TVs have it like crazy and it makes me want to throw things. One of my buddies had a TV with a "film" mode that he wasn't using, and he had disabled 24p content on his Blu-ray player. I fixed them both and was immediately happy; he still can't tell the difference but won't change it lest I yell at him next time I'm over.

I'm pretty sensitive to framerates - I suppose that's why I'm a videophile.

Judder vs. telecine judder is a mixed bag. I too would generally take regular judder; it's basically the lesser of two evils, except at certain pan speeds that make it REALLY stand out. Telecine, on the other hand, has the advantage of actually having a smaller 'step', but obviously the fact that it's irregular makes it stand out quite noticeably.

Probably 90% of the time I watch TV and DVDs at 24Hz. Hooray for video processing :)
 

msdstc

Incredibly Naive
The house I watch True Blood at has this on... it looks so tacky, almost like a home video camera. I hope this is a trend that never catches on.
 

DonMigs85

Member
StoOgE said:
Well, BRD can be increased in capacity by spec, but it would fuck all of the current players (though I'm sure Sony would shoehorn the PS3 in there somehow).

It really would have been nice if 5 years ago everyone sat down and said "alright, this is where we want to be in 10 years time" so we could get all of this sorted out in the first place :lol
Yeah it's insane. To think we lived with the NTSC standard for over 50 years, and now they keep shoving new technologies in our face.
 

-PXG-

Member
LCfiner said:
I puke a little bit inside my heart whenever I see motion flow demos in electronics stores.



I hate that shit. haaaaaate it.

Yeah, my buddy got a new 58" Samsung. It's awkward as hell.
 

Raistlin

Post Count: 9999
StoOgE said:
Well, BRD can be increased in capacity by spec, but it would fuck all of the current players (though I'm sure Sony would shoehorn the PS3 in there somehow).

'Therein lies the rub'

As much as they can, I suspect they want to avoid such situations. Something like 3D is okay, since the discs will play 2D as well IIRC.

Interestingly, they may try to shoe-horn 36GB layers into the spec. Signs point to it likely working on the majority of players. It'll be interesting to see how that goes.


It really would have been nice if 5 years ago everyone sat down and said "alright, this is where we want to be in 10 years time" so we could get all of this sorted out in the first place :lol

I'm quite sure they actually did have an idea where they wanted to be. The problem is that if they waited to actually create said tech ... it would be 10 years before they launched :p

This stuff is pretty bleeding edge, and they really don't seem to be holding back much. I suspect what launched was pretty close to the max of what they could produce. Crap, they couldn't even produce it originally :p The blue diode yields were ATROCIOUS early on.


What concerns me about 3D-BD, though, is that there has been no word on whether they upped the transfer rate. The BD movie spec is 1.5x BD speed. As far as I know, that hasn't changed for 3D. Which means you're basically splitting the usable bandwidth in half :\ Well, I would assume compression would actually reduce a ton of that since so many parts of the scene will be the same between eyes ... but it's still frustrating.

Since you're already forcing people to buy a new player, you might as well make sure you aren't actually losing fidelity in the process. Similarly, disc size is a concern. Adding size to the movie could mean over-compression or a loss of extras for 3D titles.

Between a bit of a higher transfer rate and 36GB layers, I think they'd be fine. I really hope the lack of a transfer speed increase wasn't because of Sony 'needing' the PS3 to be 3D compatible :\

Then again, the PS3 is a 2xBD. So they should have frigging put the 3D spec to at least 2xBD.

*knocks over comic book rack*
 

Raistlin

Post Count: 9999
msdstc said:
The house I watch True Blood at has this on... it looks so tacky, almost like a home video camera. I hope this is a trend that never catches on.

It already has.

The good news is you can turn it off. Even better, many are customizable, so it's possible you may like other settings ... not to mention that the actual algorithms vary between manufacturers. Interpolation between TVs is not automatically apples to apples.
 

C.Dark.DN

Banned
moniker said:
Most of that video seemed to be 24fps (or max 30). I don't understand German, but I see "120 fps" is mentioned. So they shot it at 120 fps and threw away most of the frames in post processing? What's the point of that?
I really don't know enough to continue this conversation. lol.
 

Raistlin

Post Count: 9999
DonMigs85 said:
Yeah it's insane. To think we lived with the NTSC standard for over 50 years, and now they keep shoving new technologies in our face.
NTSC is a broadcast standard. We've only seen one new one, ATSC. And the 3D format broadcasters will use is already part of it. If you simply mean to comment on how much TV tech has been changing over the last one or two decades, I can understand that.

Though if I were to nit-pick, using '50 years' isn't entirely fair either - even when limiting the conversation to TV tech advances. Certainly the advent of transistors and competing display techs (plasma, LCD, LCoS, DLP), coupled with commoditized manufacturing, has spurred a huge rate of advancement. But it isn't like things were completely stagnant beforehand. Prior to HD, there were still some big changes during that period. We saw the transition from B&W to color, increasingly flatter tubes, Sony's aperture grille ('Trinitron' tube), and ever-increasing picture fidelity (contrast, black levels, color reproduction, etc.).
 
Anyone ever play games on a 120Hz TV?

My cousin bought a 55" Sony and a lot of games look really bad on it when he leaves 120hz turned on.

Castlevania LOS is pretty much unplayable. Flashes white sporadically.
 
Whipped Spartan said:
Anyone ever play games on a 120Hz TV?

My cousin bought a 55" Sony and a lot of games look really bad on it when he leaves 120hz turned on.

Castlevania LOS is pretty much unplayable. Flashes white sporadically.
huh, interesting bump

anyways, i would imagine 120hz would lead to terrible input lag. i remember playing sf4 on my friend's hdtv with the option on. it was pretty bad. :|
 

Raistlin

Post Count: 9999
Whipped Spartan said:
Anyone ever play games on a 120Hz TV?

My cousin bought a 55" Sony and a lot of games look really bad on it when he leaves 120hz turned on.

Castlevania LOS is pretty much unplayable. Flashes white sporadically.

A few issues with gaming.

First and most importantly, by its nature interpolation induces lag. The processors aren't time machines, so they must have both the before and after frames before even beginning generation of interpolated frames. For a non-action game - say, an RPG or something - that's fine, but other than that there are going to be issues.

Secondly, depending on the algorithm, framerate drops may not be handled well. It could be quite distracting.
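
Back-of-the-envelope on the "not a time machine" point: the set has to buffer at least one future frame before it can draw anything in between, so the floor on the added delay is one source frame time plus whatever the motion estimation itself takes. (The 20 ms processing figure below is just a placeholder for illustration, not a measured number for any TV.)

Code:
def min_added_lag_ms(source_fps, processing_ms=0.0):
    # You cannot interpolate between frame N and N+1 until N+1 has arrived.
    frame_time_ms = 1000.0 / source_fps
    return frame_time_ms + processing_ms

print(min_added_lag_ms(60))         # ~16.7 ms floor on a 60fps game feed
print(min_added_lag_ms(24))         # ~41.7 ms floor on 24fps film
print(min_added_lag_ms(60, 20.0))   # plus a hypothetical 20 ms of processing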
 

Somnid

Member
This thread is kinda interesting especially from the point of view of visual technology. Most people seem to agree it looks more "real" but hate it which just flies in the face of where the industry wants to go.

Personally I like it. Very surreal effect, makes things look cool like I'm watching a play.
 

Cartman86

Banned
Does anyone know of any way (on my current computer monitor) to see the differences between what Cameron wants to do with film frame rate?
 
louis89 said:
How come everyone agrees that on movies it's shit, but the same cheap soap opera effect on video games is good?
Y'know, if a movie was actually shot on film at 60fps, I'd like to see that. However, the post processing effect these TVs do doesn't look right. Quick movements stay at 30FPS while slow ones and panning turn into 60FPS. It is really jarring seeing some on-screen elements moving at 30FPS while others move at a glitchy 60FPS. That's what this effect does.
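
That mixed 30/60 look is usually the interpolator bailing out: where the motion search can't find a confident match (fast or erratic movement), many sets just repeat the old pixels for that region, so only the well-behaved parts of the picture get the invented in-between frames. A toy per-block version of that fallback, purely as an illustration:

Code:
import numpy as np

def interp_block(block_a, block_b, match_confidence, threshold=0.5):
    # block_a/block_b: the same small region from two real frames.
    # match_confidence: how well the motion search paired them up (0..1).
    if match_confidence < threshold:
        return block_a.copy()   # give up: repeat the old block, region stays at the source rate
    blended = (block_a.astype(np.float32) + block_b.astype(np.float32)) / 2
    return blended.astype(block_a.dtype)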
 
72hz mode on the Kuro looks amazing, it's the perfect balance of smoothness without the terrible "video camera" look these 120hz sets have.
 

Raistlin

Post Count: 9999
Cartman86 said:
Does anyone know of any way (on my current computer monitor) to see the differences between what Cameron wants to do with film frame rate?
While not the exact framerates in question, check out the OP's videos here.






MWS Natural said:
72hz mode on the Kuro looks amazing, it's the perfect balance of smoothness without the terrible "video camera" look these 120hz sets have.
It's like people don't even read the thread they're posting in :\
 
Somnid said:
This thread is kinda interesting especially from the point of view of visual technology. Most people seem to agree it looks more "real" but hate it which just flies in the face of where the industry wants to go.

Personally I like it. Very surreal effect, makes things look cool like I'm watching a play.
I would like the effect if it actually worked correctly. It just always looks a bit off no matter what TV is doing it because it can't deal with fast motion.
 
I have that motion smoothing stuff set low on my TV, so that it doesn't make stuff look like it was shot with a handycam.

it looks WONDERFUL when turned on high for CG animated films (Pixar stuff, for example).
 