mrklaw said:
i.e. is the effect simply your brain seeing 'TV' where it expects to see 'movie'? If all movies from now on were shot and shown at 60fps, would you eventually get used to it and that would become normal?

Yep - it's like how people get used to seeing a 4:3 signal stretched to a 16:9 TV as "normal" (it takes about a week of viewing).
StuBurns said:
There seem to be two discussions going on: higher framerate capture and higher frame refreshing.
Higher framerate in films, I want.
Lame 120Hz smooth-poop-generator, I don't.
If Cameron gets people to change their wicked ways, he will be solidified as a God amongst men.
Shin Johnpv said:
It's not going to happen. Right now, doing films at 60fps, especially visual-FX-heavy or animated ones, is just way, way too time consuming and costly.
LCfiner said:
the 120 hz motion flow defenders in here are making me sick. :lol

The number of people in here who think 120Hz is the same thing as interpolation is making me sick.
Zefah said:
You don't need to know how the technology works in order to know that you don't like its effect on your TV's picture.
I read the OP again. I don't see how agreeing with the OP's sentiment - that having TVs set to use motion interpolation makes most movies look worse - is a bad reply worthy of getting the thread locked.
Koodo said:
Oh ok, but you didn't answer my most pressing question (which I intentionally put near the top and on a line of its own so people would not ignore it :lol).

That answer is no ... or at least not necessarily.
Shin Johnpv said:
It's not going to happen. Right now, doing films at 60fps, especially visual-FX-heavy or animated ones, is just way, way too time consuming and costly.

It is going to happen (48fps at first, probably). For live action, shooting on digital cameras is becoming standard over Super 35; it's not going to cost much more besides effects work, and if Cameron is doing it, everyone else will too.
StuBurns said:
It is going to happen (48fps at first, probably). For live action, shooting on digital cameras is becoming standard over Super 35; it's not going to cost much more besides effects work, and if Cameron is doing it, everyone else will too.

You are understating just how much effects work can cost. Will some hop on board? Sure, but that doesn't mean it will become the norm. Not everyone is afforded the budget of a Cameron movie.
Raistlin said:
You are understating just how much effects work can cost. Will some hop on board? Sure, but that doesn't mean it will become the norm. Not everyone is afforded the budget of a Cameron movie.

Because that would have been an obscene waste of cash. It's like asking why some random episode of Top of the Pops wasn't shot in 70mm back in the sixties: there was no call for it.
Look at it this way: the reason Buffy isn't on BD is that the effects in the first two seasons were implemented for SD interlaced displays. They will need to be redone if we want the series to hit HD. Why did they do something so silly at the outset? Cost.
Alx said:
Like most people today, I dislike how movies look with high framerates. But I wonder how much of it is an acquired taste. Sure, low framerate looks more fictional, but maybe our brain labels it as such because it's been fed fictions at 24fps for years.
If you're purely rational, 100/120fps is more realistic, more fluid, closer to reality... so it's a better format for a video stream. It feels strange (to us), but it's technically better. And I suppose that you could still "emulate" a lower framerate on a high-fps stream.

There's a difference between content filmed and sent to your TV at 60fps, which doesn't really exist, and things interpolated to 60fps and above. So, in fact, outside of gaming, we really don't know how native 60fps footage looks.
StuBurns said:
Because that would have been an obscene waste of cash. It's like asking why some random episode of Top of the Pops wasn't shot in 70mm back in the sixties: there was no call for it.
Put it this way: everyone who's considering doing anything in 3D has happily doubled their effects budget, and I've yet to hear anyone complain 3D is 'too expensive'.

Not necessarily. The point is that effects are often gimped to the LCD even when the film stock used could warrant higher resolution. It's actually very common, and it's why a number of shows haven't hit BD.
Obviously those tend to be bigger-budget movies. Regardless, I suspect 48Hz, and especially 60Hz, effects would cost a lot more than 3D.
Mr. Wonderful said:
There's a difference between content filmed and sent to your TV at 60fps, which doesn't really exist, and things interpolated to 60fps and above. So, in fact, outside of gaming, we really don't know how native 60fps footage looks.

Actually, I posted a link to comparison footage (http://www.avsforum.com/avs-vb/showthread.php?t=1069482) :\

Mr. Wonderful said:
This thread is going in circles, however.

This I'll certainly agree with. It is maddening.
Raistlin said:
Actually, I posted a link to comparison footage (http://www.avsforum.com/avs-vb/showthread.php?t=1069482) :\
Granted, I'm at work so I can't verify whether they are still being hosted. If not, I can upload them when I'm back from my business trip.

It works; holy shit, 24fps looks pretty shitty. Funny thing is, if you cover the 60fps side it doesn't look that bad.
Mr. B Natural said:Why the soap opera hate? I like my stories.
Seda said:
So the advantage of buying a 120Hz set over a 60Hz one is basically to eliminate "judder" on 24fps content (like DVD/BD movies), since 24 divides evenly into 120 but not into 60? Is judder the right term?
I'm a plasma guy anyway, so I really haven't paid much attention to this stuff.
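To the judder question above: yes, that's the right term, and some quick arithmetic shows why a 120Hz panel helps without any interpolation. A minimal sketch - the function name and the way the remainder is distributed are illustrative, not any TV's actual algorithm:

```python
# Mapping 24fps film frames onto a fixed-refresh display.
# 60Hz:  60/24 = 2.5, so frames must alternate between 2 and 3 refreshes
#        (the classic 3:2 pulldown) - that uneven cadence is judder.
# 120Hz: 120/24 = 5 exactly, so every frame is held for 5 refreshes and
#        motion stays even, no interpolation needed at all.

def pulldown_pattern(film_fps, refresh_hz, n_frames=6):
    """Refresh counts assigned to each of the first n_frames film frames."""
    targets = [(i * refresh_hz) // film_fps for i in range(1, n_frames + 1)]
    return [b - a for a, b in zip([0] + targets, targets)]

print(pulldown_pattern(24, 60))   # [2, 3, 2, 3, 2, 3] - uneven, judders
print(pulldown_pattern(24, 120))  # [5, 5, 5, 5, 5, 5] - even, smooth
```

The same arithmetic is why any whole multiple of 24 (120, 240, ...) can show film-rate content evenly: it's the refresh multiple that removes judder, not the interpolation feature that often ships alongside it.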
Raistlin said:
Not necessarily. The point is that effects are often gimped to the LCD even when the film stock used could warrant higher resolution. It's actually very common, and it's why a number of shows haven't hit BD.
Obviously those tend to be bigger-budget movies. Regardless, I suspect 48Hz, and especially 60Hz, effects would cost a lot more than 3D.
Post-production on movies/shows, particularly effects-laden ones, is where much of the budget and time is spent. Doubling that work is hardly something to sneeze at. Please understand, I'm all for increased framerates - I just don't have illusions of it becoming the norm.

I think you misunderstood both points. The point of not paying for 4K effects work on TV is that no one could use it; it's literally just throwing money away. And it doesn't stop them releasing anything on Blu-ray, they just choose not to - Firefly also has SD effects and it came out on Blu-ray.
StuBurns said:
I think you misunderstood both points. The point of not paying for 4K effects work on TV is that no one could use it; it's literally just throwing money away. And it doesn't stop them releasing anything on Blu-ray, they just choose not to - Firefly also has SD effects and it came out on Blu-ray.

The principle is the same. Yeah, obviously you wouldn't pay for 4K effects on a TV show. However, for something that is shot on stock where an HD release makes sense ... it would be nice to have the effects match it.

StuBurns said:
As for the 3D: 3D is doubling the work, exactly the same as 48fps. It is 48fps. Instead of one frame every 48th of a second, it's two every 24th of a second.

That isn't the case in all instances. For example, look at CGI effects (i.e. most). For 3D they simply need to move the virtual camera and re-render. That doubles rendering time, but involves little in the way of actual extra work. Doubling the frame rate doubles the actual man-hours, since in many instances the motion is hand-animated.
Raistlin said:
The principle is the same. Yeah, obviously you wouldn't pay for 4K effects on a TV show. However, for something that is shot on stock where an HD release makes sense ... it would be nice to have the effects match it.
My point is that they do not always do this. Why? Because it is costly. Citing one show that did get released even with inferior effects doesn't really argue the point. That said, the case of Buffy is in a different class: the budget was far lower, and the effects issues were more severe.
That isn't the case in all instances. For example, look at CGI effects (i.e. most). For 3D they simply need to move the virtual camera and re-render. That doubles rendering time, but involves little in the way of actual extra work. Doubling the frame rate doubles the actual man-hours, since in many instances the motion is hand-animated.

An HD release didn't make sense, that is my point.
BotoxAgent said:
CGI effects in 60fps in IMAX resolution would be gdlk :lol

Oh god, the render times that would take.

StuBurns said:
A HD release didn't make sense, that is my point.
Name a TV show today shot on 35mm that doesn't feature HD post-production, then you'll have a vague argument.
Hand-animated CG effects? Care to name some? Even if you can, just because someone keyframes 24 frames every second doesn't mean they have to start keyframing 48; they could interpolate the missing frames, i.e. no extra work.

It would still take longer to render. Just because something's not hand-animated doesn't mean it'll be any faster, especially for simulations. Render times and allocations on the render farm are a big part of the costs for each shot.
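For what "interpolate the missing frames" would mean in practice, here's a minimal sketch, assuming values keyframed at 24fps and simple linear in-betweens. The function name is illustrative; real animation curves are splines, and TV motion interpolation estimates per-pixel motion instead - this only shows the idea:

```python
# Turn values keyframed at 24fps into a 48fps sequence by inserting a
# linearly interpolated in-between after each original sample.
# Illustrative only: production tools interpolate along spline curves,
# not straight lines, and a renderer still has to render every frame.

def upsample_to_48(samples_24):
    """Double the sample rate with midpoint in-betweens."""
    out = []
    for a, b in zip(samples_24, samples_24[1:]):
        out.append(a)
        out.append((a + b) / 2.0)  # the new 48fps in-between value
    out.append(samples_24[-1])
    return out

print(upsample_to_48([0.0, 2.0, 4.0]))  # [0.0, 1.0, 2.0, 3.0, 4.0]
```

Note this only saves animator time, not machine time: twice the frames still have to go through the renderer either way.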
XiaNaphryz said:
Oh god, the render times that would take.
It would still take longer to render. Just because something's not hand-animated doesn't mean it'll be any faster, especially for simulations.

But would it take longer than rendering in 3D? That is the debate.
Dechaios said:
I saw that effect in an electronics store and the tv was playing Avatar. Never wanna see Avatar look so weird again...

Yeah, I'm pretty sure I saw this with a demo unit playing Avatar in Costco. I couldn't figure out why it looked so weird to me, but the effect was quite similar to what the OP describes.
otake said:
CPU cycles are cheap, so be it.

Cheap according to who? Our renderfarm is massive and upgraded all the time, yet we can still manage to max it out with all the work we have to do. Allocation time on the renderfarm is always something that needs to be planned ahead of time.
StuBurns said:
But would it take longer than rendering in 3D? That is the debate.

Anything that needs to go through a renderer like RenderMan or Mental Ray, be it a polygonal model, digital composite, particle effect, water sim, or digital explosion: if there's double the frames to render, it will most likely take twice as long. So yes, mono 48 would likely take the same amount of time as stereo 24, but there could be some time savings in the stereo renders due to caching, since the two eyes' frames will be really similar.
XiaNaphryz said:
Anything that needs to go through a renderer like RenderMan or Mental Ray, be it a polygonal model, digital composite, particle effect, water sim, or digital explosion: if there's double the frames to render, it will most likely take twice as long.

So 3D at 24 is basically the same in terms of workload as a theoretical 2D 48fps film?
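The back-and-forth above reduces to simple frame-count arithmetic; here's a sketch with made-up numbers (the 30 minutes per frame is purely illustrative - real per-frame render times vary enormously by shot):

```python
# Frame-count arithmetic behind "stereo 24fps ~ mono 48fps": both double
# the frame count of mono 24fps, so both roughly double render-farm time.
# All numbers are illustrative; caching between near-identical stereo
# eye frames can claw some of the doubling back.

def render_hours(duration_s, fps, eyes=1, mins_per_frame=30):
    """Total render-farm hours for one shot at a flat per-frame cost."""
    frames = duration_s * fps * eyes
    return frames * mins_per_frame / 60.0

shot = 10  # a 10-second effects shot
print(render_hours(shot, 24))          # mono 24fps
print(render_hours(shot, 24, eyes=2))  # stereo 24fps - doubled
print(render_hours(shot, 48))          # mono 48fps - same doubling
```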
StuBurns said:
A HD release didn't make sense, that is my point.
Name a TV show today shot on 35mm that doesn't feature HD post-production, then you'll have a vague argument.

Interestingly ... you're the one that brought up the show.
The Video: Sizing Up the Picture
Firefly: The Complete Series makes its long-awaited debut on Blu-ray with a somewhat mediocre 1080p/AVC-encoded transfer that fails to rejuvenate the series' problematic source. Close-ups and practical shots look quite impressive (more on that in a bit), but special effects sequences are soft (downright blurry at times), long distance pans are muddled, and texture clarity is a tad inconsistent. Fans who own Serenity in HD will be particularly disappointed since the film's detailed vistas and spacecraft sparkle in high definition compared to Firefly. Some of the series' high-def issues can be traced back to the show's limited budget and rushed production schedule, but the most distracting shots are a direct result of source limitations. While Whedon shot the majority of the series using 35mm film stock, special effects sequences were minted in lowly standard definition. Honestly, I have a hard time faulting the production team for saving cash and making the series look as good as they could at a time when HDTV was a pipe dream, but it doesn't change the fact that the BD edition of Firefly is uneven and, at times, painfully underwhelming.
StuBurns said:
Hand-animated CG effects? Care to name some? Even if you can, just because someone keyframes 24 frames every second doesn't mean they have to start keyframing 48; they could interpolate the missing frames, i.e. no extra work.

A ton of Pixar's work includes hand animation. They aren't using physics models for everything.
viakado said:
is this 120hz the same thing as the 600hz Sub-field Drive that's advertised on plasmas?
does it give the same effect?
i mean that soap opera look?
Raistlin said:Interestingly ... you're the one that brought up the show
The same issue was present in Buffy, and tons of other shows.
http://bluray.highdefdigest.com/1271/firefly.html
Either way we seem to be getting further and further away from the point and nit-picking things ancillary to the issue at hand.
The original posit was that high framerate filming would become the norm. I'm arguing that for cost reasons, don't count on it. Especially for TV. Even with film, the 'norm' does not equal Hollywood blockbuster releases.
A ton of Pixar's work includes hand animation. They aren't using physics models for everything.
That said, aren't you arguing for the use of motion interpolation, at least for effects? This thread is circling the drain lol
As I've said though, even assuming the costs are the same (which isn't always the case) ... again ... 3D is not the norm ... because of costs :lol
StuBurns said:
Firstly, the Firefly thing says pretty much my whole point, "HDTV was a pipe dream", so there was no point wasting budget on it.
Pixar (and the like) are animating CG characters; we were talking about effects added to live action footage. Pixar films and their ilk would certainly feel the growing pains of increasing framerate, more than anyone I'd imagine. (For an example of 60fps CG vaguely in that style, check out Blur's BioShock Infinite trailer; there are 60fps versions around.)
As for the additional cost in general, I just think you're wrong: it would cost more, but you haven't proven it would cost more than 3D, and that's beside the point anyway. The CG effects in most normal TV shows are virtually nothing beyond set extension, which is essentially a static addition, and often literally nothing. Any normal live action TV show, shot with digital cameras, could go to 48fps with almost no budgetary effect, I imagine.
24fps is fine, we're used to it, but people talking about going to 4K before we even up the refresh rate seems crazy to me. 48fps would have an actual effect on the way things are filmed, the style of cutting action most prevalently; 4K is just making something pretty a little bit more pretty.
Raistlin said:
Talk about missing the point ...
Fine, let me put it this way - I'll eat my hat if 'the norm' for movies is high framerates in the next decade.

If you mean starting a month from today, then I agree, it won't. If you mean in ten years, you'd be shopping for an easily digestible hat.
StuBurns said:
If you mean starting a month from today, then I agree, it won't. If you mean in ten years, you'd be shopping for an easily digestible hat.

So you're saying in 20 years then?

so what's the hoopla on that 600hz mega super duper drive thing?

Raistlin said:
So you're saying in 20 years then?

I think ten years from now almost every major film release will be in a higher framerate than 24. There will be people who won't ever move from Super 35; their films will have to remain the same, I'd imagine.
I don't understand why our discussion needed to meander the way it has. You initially stated you thought high framerates would be the norm, I disagree due to costs. The rest was us throwing feces on the wall. :lol
Xeke said:
My friend's dad bought a new HD TV and we sat down to watch it and couldn't figure out what the hell was wrong; everything looked like a soap opera. :lol It took us a few minutes to figure out how to fix it. It was terrible.
viakado said:
so what's the hoopla on that 600hz mega super duper drive thing?
what's the visual difference?

Since the technologies are totally different, it's apples and oranges for the most part.
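A rough sketch of why plasma "600Hz Sub-field Drive" and 120Hz LCD interpolation are apples and oranges (the numbers are the marketing figures, heavily simplified):

```python
# A plasma's "600Hz" is sub-field drive: each of the 60 frames per second
# is built from 10 brief sub-field flashes (600 / 60 = 10) used to dither
# pixel brightness levels. No new in-between frames are invented, so there
# is no motion-interpolated soap opera look. 120Hz LCD motion interpolation
# actually synthesizes new frames, which is where that look comes from.

frames_per_second = 60
subfields_per_second = 600
print(subfields_per_second // frames_per_second)  # sub-fields per frame
```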
StuBurns said:
I think ten years from now almost every major film release will be in a higher framerate than 24. There will be people who won't ever move from Super 35; their films will have to remain the same, I'd imagine.

While a point of contention was whether or not it will cost more than 3D, you never argued it would cost less. So let's say they're equal? That still favors my argument. If 3D were cheap, then other than the directors who object to it personally, all movies would use it. They don't - and it's known to be a cost issue.

I disagree the discussion was just throwing shit; I think the budget claim is inaccurate, that's all. But I'm certainly the one on the radical end of things: the framerate hasn't increased since the twenties (I think that's right?), so to think it'll happen as a result of Avatar 2 is quite a gamble. There are a lot of people who don't think 3D will catch on this time, although I'd argue that's a bigger change.
We'll see either way; I think it will, and more importantly, should.

I never said it shouldn't. Much the opposite. Actually I'd argue it's probably more important for 3D. Just think of those costs though.