
120hz Movies: How can people watch this shit?

Status
Not open for further replies.

StuBurns

Banned
There seem to be two discussions going on: higher-framerate capture and higher-refresh-rate display.

Higher framerate in films, I want.
Lame 120Hz smooth-poop-generator, I don't.

If Cameron gets people to change their wicked ways, he will be solidified as a God amongst men.
 

Pennybags

Member
I normally hate the effect, but the Toy Story films are served well by it.

The early Tron footage looked amazing as well.

As long as it's CG, it's cool with me.
 

Z_Y

Member
It's like watching a Paper Mario-ish version of the movie. I can't stand it myself. Some people eat it up though because they think that is how HD/BR is supposed to look. :-|
 

Fafalada

Fafracer forever
mrklaw said:
i.e is the effect simply your brain seeing 'TV' where it expects to see 'movie'? If all movies from now on were shot and shown at 60fps, would you eventually get used to it and that would become normal?
Yep - it's like how people get used to seeing 4:3 signal stretched to 16:9 TV as "normal" (takes about a week of viewing).
Which is why frame-interpolation on TVs being on by default is a good thing. It's training the masses to expect higher FPS so eventually we may actually get 60fps+ as standard in Movie theaters.
 
StuBurns said:
There seem to be two discussions going on: higher-framerate capture and higher-refresh-rate display.

Higher framerate in films, I want.
Lame 120Hz smooth-poop-generator, I don't.

If Cameron gets people to change their wicked ways, he will be solidified as a God amongst men.

It's not going to happen. Right now, doing films at 60fps, especially visual-FX-heavy or animated ones, is just way, way too time-consuming and costly.
 
Shin Johnpv said:
It's not going to happen. Right now, doing films at 60fps, especially visual-FX-heavy or animated ones, is just way, way too time-consuming and costly.


CGI effects in 60fps in IMAX resolution would be gdlk :lol
 
LCfiner said:
the 120 hz motion flow defenders in here are making me sick. :lol
The amount of people in here who think 120hz is the same thing as interpolation is making me sick.
I suppose it's due to bad marketing, though; when they have the demos running that show interpolation on and off, it always says something like "120Hz Motion Plus on" or whatever.

The effect the OP hates has nothing to do with 120Hz, therefore the thread title isn't even right. It's just a feature in most 120Hz sets that can be turned on or off; the TV always has the same 120Hz refresh rate either way.
 

Seda

Member
I think the Motionflow/automotion stuff looks great for sports and nature shows.

Terrible for movies/ network shows.
 

Raistlin

Post Count: 9999
Zefah said:
You don't need to know how the technology works in order to know that you don't like its effect on your TV's picture.
I read the OP again. I don't see how agreeing with the OP's sentiment that having TVs set to use motion interpolation makes most movies look worse is a bad reply worthy of getting the thread locked.

blarg. Read more than the OP. That's my point.

The useful info in this thread is getting lost in the noise. There have also been numerous other threads on the topic. I digress - I'll crap out some of the details below.




Koodo said:
Oh ok, but you didn't answer my most pressing question (which I intentionally put near the top and in a line of its own so people would not ignore it :lol).
That answer is no ... or at least not necessarily.

What's being lost in the circle of doom here is:

  • 120Hz refresh in-and-of-itself doesn't mean anything in terms of motion. Equating 120Hz to the 'soap opera effect' is a complete fallacy (to make things easier, let's call that effect SOE for short). Refresh and framerate are not the same thing. Refresh purely refers to the frequency at which the display redraws. A 120Hz TV can display 30fps content - it simply repeats each frame 4 times, etc. Think of the refresh rate as more of a maximum available framerate - or more accurately, a definition of frame hold-times. I'll get into that.
  • One category of post-processing possible with 120Hz refreshes is dedicated to improving temporal resolution. There are a million names used for it, but the point is to lower the hold time of the image*. This typically uses black frame or contrast-adjusted frame insertion. Neither of which cause SOE. The only negative it can produce is a loss of brightness.
  • Where SOE comes from is frame interpolation. This is used to de-judder the image (and it also improves temporal resolution as a side-effect, since it is lowering the hold time). As a high-level example of what is happening here ... pretend the original content has frames that go like this: A..B..C..D. What frame interpolation could do is display this as A.AB.B.BC.C.CD.D. Look at it as inserting new frames between the originals in order to blend between them. It appears to increase the framerate.

Where people seem to really get lost on that last one is that basically every manufacturer does it differently (I should note that generating the motion vectors, etc. to do things right is not nearly as simple as the 'blended' frames example implies), and most TVs offer numerous configurations for said algorithm(s). It is quite possible that someone could have one of the low settings on and you wouldn't even necessarily notice it. There would be no SOE, but if you watched it side-by-side with a normal TV you would see the reduction in judder, etc.

Pursuant to your main question, yes, interpolation can be used to give the appearance of a higher framerate. However, by definition it induces lag, since it is a purely post-processing effect. It needs to wait for two consecutive frames, and only then can it generate the in-between frame(s) and start displaying things. It is inherently buffering data. For something like an RPG, that's fine. For action games, not so much. It should also be pointed out that many algorithms can be finicky when handling fluctuating framerates. Even if one does handle them well, the frame drops can become more apparent with interpolation.
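As a toy sketch of the blended-frame example above (frames here are just numbers standing in for images, and the naive linear blend is my own illustration - real TVs estimate motion vectors, which is far harder):

```python
def interpolate(frames, inserted=1):
    """Insert `inserted` blended frames between each consecutive pair.

    Note the lag implication mentioned above: the blend between A and B
    can't be computed until B has arrived, so the set must buffer ahead.
    """
    out = []
    for a, b in zip(frames, frames[1:]):
        out.append(a)
        for i in range(1, inserted + 1):
            t = i / (inserted + 1)
            out.append(a * (1 - t) + b * t)  # naive linear blend
    out.append(frames[-1])
    return out

# 3 source frames become 5 displayed frames
print(interpolate([0.0, 1.0, 2.0]))  # [0.0, 0.5, 1.0, 1.5, 2.0]
```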








* Hold time refers to how long a given frame is displayed before the next is shown. The issue here is that LCD (and current OLED) uses a display technique known as sample-and-hold. Basically it displays a frame - and leaves it there until the next frame is needed. The initial frame is there the entire duration in between.

Why does this matter? The blur many people associate with LCDs - which is actually low temporal resolution - isn't technically happening on the TV. It's in our brains. The way our brain interprets motion is a complicated beast. If an image is 'held' for a certain duration, it basically gets burnt into our brain for a short time. When the next frame is then displayed, our brain is actually blurring them together ... not the TV.

Whose fault it is doesn't really matter. It is a side-effect of the display method. CRT, DLP, Plasma, etc. do not use sample-and-hold, which is why they have superior motion handling by default. Their hold times are inherently shorter than the full frame LCD normally uses. Stuff like black/contrast-adjusted frame insertion, as well as de-judder, effectively lowers the hold time ... improving temporal resolution. All this stuff is basically chasing what other techs already handle decently by design. Well, de-judder has other uses, but you get the point.
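To put rough numbers on the hold-time idea (my own toy model, not any panel's spec; `lit_fraction` is a made-up knob standing in for black/contrast-adjusted frame insertion):

```python
def hold_time_ms(content_fps, refresh_hz, lit_fraction=1.0):
    """How long one source frame sits on a sample-and-hold display.

    lit_fraction < 1.0 models black-frame insertion: part of each
    frame's slot shows black instead, so the image is held for less
    time - at the cost of brightness, as noted above.
    """
    repeats = refresh_hz // content_fps  # refreshes per source frame
    return 1000.0 / refresh_hz * repeats * lit_fraction

print(hold_time_ms(24, 120))       # ~41.7 ms per film frame, full hold
print(hold_time_ms(24, 120, 0.5))  # ~20.8 ms with half of each slot dark
```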
 

StuBurns

Banned
Shin Johnpv said:
Its not going to happen. Right now doing films, specially Visual FX heavy, or animated ones, at 60fps is just way way way too time consuming and costly.
It is going to happen (48fps at first, probably). As live-action shooting on digital cameras becomes standard over Super 35, it's not going to cost much more besides effects work, and if Cameron is doing it, everyone else will too.
 

Raistlin

Post Count: 9999
StuBurns said:
It is going to happen (48fps at first, probably). As live-action shooting on digital cameras becomes standard over Super 35, it's not going to cost much more besides effects work, and if Cameron is doing it, everyone else will too.
You are understating just how much effects work can cost. Will some hop on board? Sure, that doesn't mean it will become the norm. Not everyone is afforded the budgets of a Cameron movie.

Look at it this way, the reason Buffy isn't on BD is because the effects in the first two seasons were implemented for SD interlaced displays. They will need to be redone if we want the series to hit HD. Why did they do something so silly at the onset? Cost.
 

StuBurns

Banned
Raistlin said:
You are understating just how much effects work can cost. Will some hop on board? Sure, that doesn't mean it will become the norm. Not everyone is afforded the budgets of a Cameron movie.

Look at it this way, the reason Buffy isn't on BD is because the effects in the first two seasons were implemented for SD interlaced displays. They will need to be redone if we want the series to hit HD. Why did they do something so silly at the onset? Cost.
Because that would have been an obscene waste of cash. It's like saying why wasn't some random episode of Top of the Pops shot in 70mm back in the sixties. It's because there was no call for it.

Put it this way, everyone who's considering doing anything in 3D has happily doubled their effects budget, I've yet to hear anyone complain 3D is 'too expensive'.
 
Alx said:
Like most people today, I dislike how movies look with high framerates. But I wonder how much of it is an acquired taste. Sure low framerate looks more fictional, but maybe our brain labels it as such because it's been fed fictions at 24 fps for years.

If you're purely rational, 100/120 fps is more realistic, more fluid, closer to reality... so it's a better format for a video stream. It feels strange (for us) but it's technically better. And I suppose that you could still "emulate" a lower framerate on a high fps stream.
There's a difference between content filmed and sent to your TV at 60fps, which doesn't really exist, and things interpolated to 60fps and above. So, in fact, outside of gaming, we really don't know how native 60fps footage looks.

This thread is going in circles, however.
 

Raistlin

Post Count: 9999
StuBurns said:
Because that would have been an obscene waste of cash. It's like saying why wasn't some random episode of Top of the Pops shot in 70mm back in the sixties. It's because there was no call for it.
Not necessarily. The point is that effects are many times gimped to the lowest common denominator even when the film stock used could warrant higher resolution. It's actually very common, and why a number of shows haven't hit BD.

Put it this way, everyone who's considering doing anything in 3D has happily doubled their effects budget, I've yet to hear anyone complain 3D is 'too expensive'.
Obviously those tend to be bigger budget movies. Regardless, I suspect 48Hz or especially 60Hz effects would cost a lot more than 3D.

Post production on movies/shows, particularly effects-laden ones, is where much of the budget and time is spent. Doubling that work is hardly something to sneeze at. Please understand I'm all for increased framerates - I just don't have illusions of it becoming the norm.
 

Raistlin

Post Count: 9999
Mr. Wonderful said:
There's a difference between content filmed and sent to your TV at 60fps, which doesn't really exist, and things interpolated to 60fps and above. So, in fact, outside of gaming, we really don't know how native 60fps footage looks.
Actually I posted a link to comparison footage (http://www.avsforum.com/avs-vb/showthread.php?t=1069482) :\

Granted, I'm at work so I can't verify whether they are still hosted. If not, I can upload them when I'm back from my business trip.



This thread is going in circles, however.
This I'll certainly agree with. It is maddening.
 

DarkKyo

Member
I saw that effect in an electronics store and the tv was playing Avatar. Never wanna see Avatar look so weird again...
 

Raistlin

Post Count: 9999
2San,

That's the rub, a lot in the A/V world is relative. On their own they may be serviceable, but once you see what's actually possible side-by-side, you can never go back.

Sometimes I almost want to jump in bed with the 'it's better to not know' argument :lol
 

.GqueB.

Banned
You know what that shite reminds me of? Extras on a dvd. You know where you get that shot of someone filming a film shoot like on the HBO making ofs? Looks JUST like that.

Fucking hate it.
 

Jim

Member
Mr. B Natural said:
Why the soap opera hate? I like my stories.

Curious what happens when you use one of the frame interpolation modes when watching an actual soap opera. Time travel of some sort?
 

Seda

Member
So the advantage of buying a 120Hz set over a 60Hz one is basically to eliminate "judder" on 24fps content (like DVD/BD movies), since 24 divides evenly into 120 but not into 60? Is judder the right term?

I'm a plasma guy anyway, so I really haven't paid much attention to this stuff.

A 120Hz TV would redraw each frame 5 times, whereas a 60Hz TV redraws alternating frames 2 times, then 3 times, then 2 times, then 3 times, and so on.
 

Raistlin

Post Count: 9999
Seda said:
So the advantage of buying a 120Hz set over a 60Hz one is basically to eliminate "judder" on 24fps content (like DVD/BD movies), since 24 divides evenly into 120 but not into 60? Is judder the right term?

I'm a plasma guy anyway, so I really haven't paid much attention to this stuff.

Judder is actually a super-set.

The judder most people are referring to is telecine judder. That's the irregular motion generated by the telecine process (24Hz -> 60Hz conversion). A 120Hz TV (if implementing the processing properly) can in fact eliminate telecine judder. Granted, the vast majority can only do that if actually sent a 24Hz signal - they won't detect a film cadence from 60Hz output and reconstruct the 24Hz original. Sadder still, some 120Hz TVs don't even do things right when sent a 24Hz signal.

The other general type of judder is the regular-interval jumpiness seen due to the low framerate the content is filmed in - for example, during a pan. Even when the telecine (irregular) judder is gone, you can still notice that the pan is less than smooth. The de-judder algorithms can help with that (and, as a side-effect, improve temporal resolution).
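The two cadences can be sketched like so (a toy model of my own; real sets also have to detect the cadence, which is the part many get wrong):

```python
def repeat_cadence(content_fps, refresh_hz, n_frames=8):
    """How many refreshes each source frame occupies on screen."""
    counts, shown = [], 0
    for frame in range(1, n_frames + 1):
        target = int(frame * refresh_hz / content_fps)  # slots used so far
        counts.append(target - shown)
        shown = target
    return counts

print(repeat_cadence(24, 120))  # [5, 5, 5, ...] - even hold, no telecine judder
print(repeat_cadence(24, 60))   # [2, 3, 2, 3, ...] - the irregular 3:2 pulldown
```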
 

StuBurns

Banned
Raistlin said:
Not necessarily. The point is that effects are many times gimped to the lowest common denominator even when the film stock used could warrant higher resolution. It's actually very common, and why a number of shows haven't hit BD.


Obviously those tend to be bigger budget movies. Regardless, I suspect 48Hz or especially 60Hz effects would cost a lot more than 3D.

Post production on movies/shows, particularly effects-laden ones, is where much of the budget and time is spent. Doubling that work is hardly something to sneeze at. Please understand I'm all for increased framerates - I just don't have illusions of it becoming the norm.
I think you misunderstood both points. The point of not paying for 4K effects work on TV is that no one could use it; it's literally just throwing money away. And it doesn't stop them releasing anything on Blu-ray, they just choose not to - Firefly also has SD effects and it came out on Blu-ray.

As for the 3D, 3D is doubling the work, exactly the same as 48fps. It is 48fps. Instead of one iteration every 48th of a second, it's two every 24th of a second.
 

Raistlin

Post Count: 9999
StuBurns said:
I think you misunderstood both points. The point of not paying for 4K effects work on TV is that no one could use it; it's literally just throwing money away. And it doesn't stop them releasing anything on Blu-ray, they just choose not to - Firefly also has SD effects and it came out on Blu-ray.
The principle is the same. Yeah, obviously you wouldn't pay for 4K effects on a TV show. However, for something that is shot on stock where an HD release makes sense ... it would be nice to have the effects match it.

My point is that they do not always do this. Why? Because it is costly. Citing one show that did get released even with inferior effects doesn't really argue the point. That said, the case of Buffy is in a different class. The budget was far lower, and that includes the effects issues being more severe.

As for the 3D, 3D is doubling the work, exactly the same as 48fps. It is 48fps. Instead of one iteration every 48th of a second, it's two every 24th of a second.
That isn't the case in all instances. For example, look at CGI effects (i.e. most of them). For 3D they simply need to move the virtual camera and re-render. That doubles rendering time, but involves little in the way of actual extra work. Doubling the frame rate doubles the actual man-hours, since in many instances the motion is hand-animated.

What I wouldn't think needs to be said, however, is the fact that not all movies are shot in 3D (or post-processed to 3D). Why? In most cases, cost. If anything, you're helping my point here.
 

StuBurns

Banned
Raistlin said:
The principle is the same. Yeah, obviously you wouldn't pay for 4K effects on a TV show. However, for something that is shot on stock where an HD release makes sense ... it would be nice to have the effects match it.

My point is that they do not always do this. Why? Because it is costly. Citing one show that did get released even with inferior effects doesn't really argue the point. That said, the case of Buffy is in a different class. The budget was far lower, and that includes the effects issues being more severe.


That isn't the case in all instances. For example, look at CGI effects (i.e. most of them). For 3D they simply need to move the virtual camera and re-render. That doubles rendering time, but involves little in the way of actual extra work. Doubling the frame rate doubles the actual man-hours, since in many instances the motion is hand-animated.
A HD release didn't make sense, that is my point.

Name a TV show today shot on 35mm that doesn't feature HD post-production, then you'll have a vague argument.

Hand-animated CG effects? Care to name some? Even if you can, just because someone keyframes 24 frames every second doesn't mean they have to start keyframing 48; they could interpolate the missing frames, i.e. no extra work.
 

XiaNaphryz

LATIN, MATRIPEDICABUS, DO YOU SPEAK IT
BotoxAgent said:
CGI effects in 60fps in IMAX resolution would be gdlk :lol
Oh god the render times that would take. :(

StuBurns said:
A HD release didn't make sense, that is my point.

Name a TV show today shot on 35mm that doesn't feature HD post-production, then you'll have a vague argument.

Hand-animated CG effects? Care to name some? Even if you can, just because someone keyframes 24 frames every second doesn't mean they have to start keyframing 48; they could interpolate the missing frames, i.e. no extra work.
It would still take longer to render. Just because something's not hand-animated doesn't mean it'll be any faster, especially for simulations. Render times and allocations on the render farm are a big part of the costs for each shot.
 

otake

Doesn't know that "You" is used in both the singular and plural
XiaNaphryz said:
Oh god the render times that would take. :(


It would still take longer to render. Just because something's not hand animated doesn't mean it'll be any faster, especially for simulations.

CPU cycles are cheap, so be it.
 

StuBurns

Banned
XiaNaphryz said:
It would still take longer to render. Just because something's not hand animated doesn't mean it'll be any faster, especially for simulations.
But would it take longer than rendering in 3D? That is the debate.

It's 48fps either way. A mono 48 or a stereo 24.
 

Stalfos

Member
Dechaios said:
I saw that effect in an electronics store and the tv was playing Avatar. Never wanna see Avatar look so weird again...
Yeah I'm pretty sure I saw this with a demo unit playing Avatar in Costco. I couldn't figure out why it looked so weird to me but the effect was quite similar to what the OP describes.
 

XiaNaphryz

LATIN, MATRIPEDICABUS, DO YOU SPEAK IT
otake said:
CPU cycles are cheap, so be it.
Cheap according to whom? Our render farm is massive and upgraded all the time, yet we still manage to max it out with all the work we have to do. Allocation time on the render farm is always something that needs to be planned ahead of time.

StuBurns said:
But would it take longer than rendering 3D, that is the debate.
Anything that needs to go through a renderer like Renderman or Mental Ray, be it a polygonal model or digital composite or particle effect or water sim or digital explosion, if there's double the frames to render it will most likely take twice as long. So yes, mono 48 would likely take the same amount of time as stereo 24, but there could be some time savings in the stereo renders due to some stuff you can do with caching since the frames will be really similar.
 

StuBurns

Banned
XiaNaphryz said:
Anything that needs to go through a renderer like Renderman or Mental Ray, be it a polygonal model or digital composite or particle effect or water sim or digital explosion, if there's double the frames to render it will most likely take twice as long.
So 3D at 24 is basically the same in terms of work load as a theoretical 2D 48fps film?

As someone who had access to lots of Avatar fun, did you see the high framerate tests Cameron talked about? If you're allowed to say either way. Outside of the few IMAX HD things most people have never seen, the public really hasn't had access to high framerate films, mastered with proper direction etc.
 

Raistlin

Post Count: 9999
StuBurns said:
A HD release didn't make sense, that is my point.

Name a TV show today shot on 35mm that doesn't feature HD post-production, then you'll have a vague argument.
Interestingly ... you're the one that brought up the show ;)

The same issue was present in Buffy, and tons of other shows.

http://bluray.highdefdigest.com/1271/firefly.html

The Video: Sizing Up the Picture

’Firefly: The Complete Series’ makes its long-awaited debut on Blu-ray with a somewhat mediocre 1080p/AVC-encoded transfer that fails to rejuvenate the series’ problematic source. Close-ups and practical shots look quite impressive (more on that in a bit), but special effects sequences are soft (downright blurry at times), long distance pans are muddled, and texture clarity is a tad inconsistent. Fans who own ‘Serenity’ in HD will be particularly disappointed since the film’s detailed vistas and spacecraft sparkle in high definition compared to ‘Firefly.’ Some of the series’ high-def issues can be traced back to the show’s limited budget and rushed production schedule, but the most distracting shots are a direct result of source limitations. While Whedon shot the majority of the series using 35mm film stock, special effects sequences were minted in lowly standard definition. Honestly, I have a hard time faulting the production team for saving cash and making the series look as good as they could at a time when HDTV was a pipe dream, but it doesn’t change the fact that the BD edition of ‘Firefly’ is uneven and, at times, painfully underwhelming.

Either way we seem to be getting further and further away from the point and nit-picking things ancillary to the issue at hand.

The original posit was that high framerate filming would become the norm. I'm arguing that for cost reasons, don't count on it. Especially for TV. Even with film, the 'norm' does not equal Hollywood blockbuster releases.

Hand-animated CG effects? Care to name some? Even if you can, just because someone keyframes 24 frames every second doesn't mean they have to start keyframing 48; they could interpolate the missing frames, i.e. no extra work.
A ton of Pixar's work includes hand animation. They aren't using physics models for everything.

That said, aren't you arguing for the use of motion interpolation at least for effects? This thread is circling the drain lol :p

As I've said though, even assuming the costs are the same (which isn't always the case) ... again ... 3D is not the norm ... because of costs :lol
 
I saw I, Robot playing in this mode at Future Shop. It actually breaks the blurred line between what's real and what is CG. Anything computer-generated stands out like a sore thumb.
 

StuBurns

Banned
Raistlin said:
Interestingly ... you're the one that brought up the show ;)

The same issue was present in Buffy, and tons of other shows.

http://bluray.highdefdigest.com/1271/firefly.html



Either way we seem to be getting further and further away from the point and nit-picking things ancillary to the issue at hand.

The original posit was that high framerate filming would become the norm. I'm arguing that for cost reasons, don't count on it. Especially for TV. Even with film, the 'norm' does not equal Hollywood blockbuster releases.


A ton of Pixar's work includes hand animation. They aren't using physics models for everything.

That said, aren't you arguing for the use of motion interpolation at least for effects? This thread is circling
the drain lol :p

As I've said though, even assuming the costs are the same (which isn't always the case) ... again ... 3D is not the norm ... because of costs :lol

Firstly, the Firefly thing says pretty much my whole point, "HDTV was a pipe dream", so there is no point wasting budget on it.

Pixar (and the like) are animating CG characters, we were talking about effects added to live action footage. Pixar films and their ilk would certainly feel the growing pains of increasing framerate, more than anyone I'd imagine. (For an example of 60fps CG vaguely in that style, check out Blur's BioShock Infinite trailer, there are 60fps versions around).

As for the additional cost in general, I just think you're wrong; it would cost more, but you haven't proven it would cost more than 3D, and that's beside the point. The CG effects in most normal TV shows are virtually nothing beyond set extension, which is essentially a static addition, and often literally nothing. Any normal live-action TV show, shot with digital cameras, could go to 48fps with almost no budgetary effect, I imagine.

24fps is fine, we're used to it, but people talking about going to 4K before we even up the frame rate seems crazy to me. 48fps would have an actual effect on the way things are filmed - the style of cutting action, most prevalently - while 4K is just making something pretty a little bit more pretty.
 

Raistlin

Post Count: 9999
StuBurns said:
Firstly, the Firefly thing says pretty much my whole point, "HDTV was a pipe dream", so there is no point wasting budget on it.


Pixar (and the like) are animating CG characters, we were talking about effects added to live action footage. Pixar films and their ilk would certainly feel the growing pains of increasing framerate, more than anyone I'd imagine. (For an example of 60fps CG vaguely in that style, check out Blur's BioShock Infinite trailer, there are 60fps versions around).

As for the additional cost in general, I just think you're wrong; it would cost more, but you haven't proven it would cost more than 3D, and that's beside the point. The CG effects in most normal TV shows are virtually nothing beyond set extension, which is essentially a static addition, and often literally nothing. Any normal live-action TV show, shot with digital cameras, could go to 48fps with almost no budgetary effect, I imagine.

24fps is fine, we're used to it, but people talking about going to 4K before we even up the frame rate seems crazy to me. 48fps would have an actual effect on the way things are filmed - the style of cutting action, most prevalently - while 4K is just making something pretty a little bit more pretty.

Talking about missing the point ...

Fine let me put it this way - I'll eat my hat if 'the norm' for movies is high framerates in the next decade.
 

StuBurns

Banned
Raistlin said:
Talking about missing the point ...

Fine let me put it this way - I'll eat my hat if 'the norm' for movies is high framerates in the next decade.
If you mean starting a month from today, then I agree, it won't. If you mean in ten years, you'd be shopping for an easily digestible hat.
 

Raistlin

Post Count: 9999
StuBurns said:
If you mean starting a month from today, then I agree, it won't. If you mean in ten years, you'd be shopping for an easily digestible hat.

So you're saying in 20 years then? :D

I don't understand why our discussion needed to meander the way it has. You initially stated you thought high framerates would be the norm, I disagree due to costs. The rest was us throwing feces on the wall. :lol
 

StuBurns

Banned
Raistlin said:
So you're saying in 20 years then? :D

I don't understand why our discussion needed to meander the way it has. You initially stated you thought high framerates would be the norm, I disagree due to costs. The rest was us throwing feces on the wall. :lol
I think ten years from now almost every major film release will be in a higher framerate than 24. There will be people who won't ever move from Super 35; their films will have to remain the same, I'd imagine.

I disagree the discussion was just throwing shit; I think the budget thing is inaccurate, that's all. But I'm certainly the one on the radical end of things: framerate hasn't increased since the twenties (I think that's right?), so to think it'll happen as a result of Avatar 2 is quite a gamble. There are a lot of people who don't think 3D will catch on this time, although I'd argue that's a bigger change.

We'll see either way, I think it will, and more importantly, should.
 

dorkimoe

Gold Member
Xeke said:
My friends dad bought a new HD tv and we sat down to watch it and couldn't figure out what the hell was wrong, everything looked like a soap opera.:lol It took us a few minutes to figure out how to fix it, it was terrible.

So that's what that is ... I'll have to turn this off on my mom's TV. Everyone's mouths are out of sync with the audio. I'll have to look in the settings.
 

Raistlin

Post Count: 9999
viakado said:
so whats the hoopla on that 600hz mega super duper drive thing?
whats the visual difference?
Since the technologies are totally different, it's apples and oranges for the most part.


The question is why advertise it then? It's a numbers game ... basically dick waving Hz values. I wouldn't get too mad at the Plasma makers for it though, they were pushed into it by LCD makers. LCD manufacturers started advertising 120Hz and 240Hz as a prominent feature, as though it made it better than Plasma automatically. While one can argue the merits of motion interpolation/de-judder (which some plasmas now offer well), the other feature these sets are promoting is the increased motion resolution. The problem is, it's not like it's an advantage over Plasma. Much the opposite, it's LCD's upward climb to simply match Plasma motion handling (which it hasn't).

LCD took what could be argued as a disadvantage and marketed such that it appears to be a feature to J6P. Plasma makers were simply forced onto the PR shit train
 

Raistlin

Post Count: 9999
StuBurns said:
I think ten years from now almost every major film release will be in a higher framerate than 24. There will be people who won't ever move from Super 35; their films will have to remain the same, I'd imagine.

The problem is, what do you mean by a 'major' release? Hollywood blockbusters? Basically, the movies with $80-million-plus budgets? As I said, I agree with that ... the problem is that isn't the majority of film releases. It may seem that way due to advertising, but 99% of advertising dollars go to like 5% of movies.

I disagree the discussion was just throwing shit; I think the budget thing is inaccurate, that's all. But I'm certainly the one on the radical end of things: framerate hasn't increased since the twenties (I think that's right?), so to think it'll happen as a result of Avatar 2 is quite a gamble. There are a lot of people who don't think 3D will catch on this time, although I'd argue that's a bigger change.
While a point of contention was whether or not it will cost more than 3D, you never argued it would cost less. So let's say they are equal? That still favors my argument. If 3D was cheap, other than the directors that object to it personally, all movies would use it. They don't - and it's known to be a cost issue.

We'll see either way, I think it will, and more importantly, should.
I never said it shouldn't. Much the opposite. Actually I'd argue it's probably more important for 3D. Just think of those costs though :p
 