Phire Phox
Banned
Yay! I can now see my Browns get whooped in 60 FPS! :/
Guess I just take this for granted, but yeah we've been there for some time.
edit: One of the bigger problems is all the motion artifacts when things start moving at high speed. Always been an issue.
I'm not sure where you are getting your info from, but North American television broadcasts are 30 frames per second, not 60. Could you source where you're getting this 60fps from?
I don't know if you're confused or getting into a semantic argument or what, but let me try to clarify the original explanation.
Like I mentioned before, most stations broadcast in 720p60 or 1080i60.
In both cases you can push 24FPS, 30FPS and 60FPS content through.
Obviously there's judder introduced with 24FPS unless you have a 120Hz panel that's smart enough to recognize the 24FPS content and display it correctly. On a 720p broadcast it's already a progressive signal. On a 1080i60 broadcast it comes in as interlaced, but all of the information to reconstruct it as 1080p video is there. Modern TVs are able to apply some processing to get it back to its original progressive state -- there's no information lost. So even though it's a 1080i60 broadcast, you're essentially seeing a 1080p24 broadcast on a 1080p panel.
For 30FPS it's pretty much the same deal, except there's no judder introduced since 30 fits into 60Hz evenly (every frame is displayed twice).
For 60FPS on a 720p signal, it's straightforward: every frame is a unique progressive image.
For 60FPS on a 1080i signal, the way it works is that there are 60 frames of information, but the even frames only contain image information for horizontal lines 1, 3, 5, 7, ... 1077, 1079 and the odd frames contain image information for lines 2, 4, 6, 8, ... 1078, 1080.

On an old-school CRT HDTV, what would happen is that it literally would display only the odd lines one frame, and then the even lines the next frame. Since the lines being lit were alternating so fast, it would trick your brain into thinking you were seeing a progressive image, and it would look pretty good. On a fixed-pixel display (like an LCD) they can't really do that -- you'd see way too much flicker and it would look bad.

So instead they have algorithms which, for the even frames, will fill in the information for lines 2, 4, 6, 8, ... 1078, 1080 based on the previous frame's lines 2, 4, 6, 8, ... 1078, 1080, the current frame's lines 1, 3, 5, 7, ... 1077, 1079, and the next frame's lines 2, 4, 6, 8, ... 1078, 1080. For the odd frames it fills in lines 1, 3, 5, 7, ... 1077, 1079 using a similar technique. Yes, it's interpolation, but it's not interpolating a 30FPS signal into a 60FPS progressive signal. It's interpolating a 60FPS interlaced signal into a 60FPS progressive signal. And what your eyes see on your TV is literally a 60FPS progressive signal.
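A minimal sketch of that kind of temporal field interpolation (the function name and the plain averaging are my own illustration; real deinterlacers are motion-adaptive and far more sophisticated):

```python
import numpy as np

def deinterlace(prev_field, cur_field, next_field, cur_has_odd_lines):
    """Rebuild a full-height progressive frame from one half-height field.

    The lines the current field is missing are estimated by averaging
    the matching lines of the previous and next fields, as described
    above. This is a toy sketch, not a production algorithm.
    """
    full_height = cur_field.shape[0] * 2
    frame = np.empty((full_height,) + cur_field.shape[1:], dtype=float)
    if cur_has_odd_lines:                # field carries lines 1, 3, 5, ...
        frame[0::2] = cur_field          # rows we have
        frame[1::2] = (prev_field + next_field) / 2.0  # rows we estimate
    else:                                # field carries lines 2, 4, 6, ...
        frame[1::2] = cur_field
        frame[0::2] = (prev_field + next_field) / 2.0
    return frame

# Tiny example: three 2x3 "fields" standing in for 540x1920 ones.
prev_f = np.full((2, 3), 10.0)
cur_f  = np.full((2, 3),  5.0)
next_f = np.full((2, 3), 30.0)
frame = deinterlace(prev_f, cur_f, next_f, cur_has_odd_lines=True)
print(frame.shape)  # (4, 3): a full-height progressive frame
```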
Ugh. Seriously? YES. They broadcast in 60. ALL live sports are broadcast in 60 in North America. As I previously explained in the thread, even 1080i broadcasts are 60FPS, not 30.
Dude. 1080i60 gets converted to 1080p60 on the fly by all modern TVs if the content is above 30fps ... Sometimes you can see weird artifacts with fast motion but you'd have to know to look for it.
I wonder if people who always have frame interpolation turned on in their TVs will even notice a difference.
Can you source where you're getting this 30 frames per second on all broadcasts in NA from?
Frame Rate Basics
When transferring film to video, you need to take into account the differences in film and video frame rates. Film is commonly shot at 24 frames per second (fps), although 25 fps is sometimes used when the final project is to be delivered as PAL video (as opposed to the more common technique of just speeding up 24 fps film to 25 fps). Video can have a 29.97 fps rate (NTSC), a 25 fps rate (PAL), or either a 24 fps or 23.98 fps rate (24p), depending on your video standard.
The frame rate of your video (whether you sync the audio during the telecine transfer or not) and the frame rate you want to edit at can determine what you need to do to prepare your clips for editing. You may find it useful to read Determining How to Prepare Source Clips for Editing before you make any decisions about frame rates.
Working with NTSC Video
The original frame rate of NTSC video was exactly 30 fps. When color was added, the rate had to be changed slightly, to the rate of 29.97 fps. The field rate of NTSC video is 59.94 fields per second. NTSC video is often referred to as having a frame rate of 30 fps, and although the difference is not large, it cannot be ignored when transferring film to video (because of its impact on audio synchronization, explained in Synchronizing the Audio with the Video).
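For reference, the "slightly changed" rates are the exact ratios 30000/1001 and 60000/1001, which round to the familiar 29.97 and 59.94. A quick sketch (variable names are mine):

```python
from fractions import Fraction

# Exact NTSC color rates: the original 30 fps was scaled by 1000/1001.
frame_rate = Fraction(30000, 1001)   # ~29.97 frames per second
field_rate = frame_rate * 2          # ~59.94 fields per second

print(round(float(frame_rate), 5))  # 29.97003
print(round(float(field_rate), 5))  # 59.94006
```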
Another issue is how to distribute film’s 24 fps among NTSC video’s 29.97 fps.
The most common approach to distributing film’s 24 fps among NTSC video’s 29.97 fps is to perform a 3:2 pull-down (also known as a 2:3:2:3 pull-down). If you alternate recording two fields of one film frame and then three fields of the next, the 24 frames in 1 second of film end up filling the 30 frames in 1 second of video.
Note: The actual NTSC video frame rate is 29.97 fps. The film frame rate is modified to 23.98 fps in order to create the 3:2 pattern.
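The 2:3 cadence the manual describes can be sketched like this (the frame letters and function name are just illustrative labels):

```python
def pulldown_32(film_frames):
    """Map film frames to interlaced video frames via 3:2 pull-down.

    Each film frame is held for alternately 2 or 3 video fields, so
    4 film frames become 10 fields = 5 video frames, and 24 film
    frames per second fill 60 fields = 30 video frames per second.
    """
    fields = []
    for i, frame in enumerate(film_frames):
        fields.extend([frame] * (2 if i % 2 == 0 else 3))
    # Pair consecutive fields into (top, bottom) video frames.
    return list(zip(fields[0::2], fields[1::2]))

print(pulldown_32(list("ABCD")))
# [('A', 'A'), ('B', 'B'), ('B', 'C'), ('C', 'D'), ('D', 'D')]
```

Note the two "mixed" video frames (B/C and C/D) containing fields from different film frames; that is where pull-down judder comes from.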
Your original comment here was making a firm point that all live sports in the US are broadcast at 60 fps. I never commented on the process of creating a progressive image from an interlaced one, since resolution is independent from framerate, but your explanations gave the impression that a 1080p60 video source is necessarily 60 fps (as opposed to frame doubling the source). This is what I assumed was your misunderstanding.
Your explanation here was also giving the impression that you're assuming that 60 fields per second means a signal is also 60 frames per second. This was your original post that brought up this line of questions.
In this thread, people fail to understand the difference between 60 interlaced fields per second and 30 or 60 progressive frames per second.
Broadcast HD 720p is 60 progressive frames per second, 1080i is 60 interlaced fields per second. For sports broadcasting the signal is always natively 60p or 60i in North America.
The Xbone is not doing anything unique or interesting here. You can watch the NFL every Sunday for the price of a $30 antenna, because the NFL is the only professional sport still broadcast over the air by local network affiliate stations. If you want to spend $500 on an inferior game console to watch the NFL, instead of spending $400 on a PS4 (which is superior as a game console) and $30 on an antenna to get the same 720p or 1080i, then great job there, sparky. MS is over-delivering on value as usual.
That's like asking if I could source where I learned that 2 + 2 equals 4. I've just known this information for so long. Maybe it's because I've dealt with several capture cards dating back to when I only had analog cable, and have played around with third-party capture software like DScaler, which absolutely blew away the default Hauppauge or ATI TV Wonder capture software because DScaler could both deinterlace a 480i signal correctly and had very little input lag, so I could hook my PS2 up to it to play GTA3 and MGS2 and have it be actually playable and look pretty good.
LOL at the bolded.
Not sure what you're arguing here, and no there is no frame doubling going on.
Whether a 1080i60 signal is actually 1920x1080@30Hz with 2 fields per frame or 1920x540@60Hz is not really important to the argument of whether sports are already broadcast at 60FPS. They are.

Well, I hate to say it, but I think you're misinformed about a few things. I don't claim to be an expert on frame rates and the like, but my experience with this stuff leaves me believing that some of your facts are wrong.
Today's broadcast video common denominator is SDI, either SD or HD. SMPTE 294M specifies SDI at a 29.97 frame rate and a 59.94 field rate for SD and 1080i or p. HDTV in the 1080i format, as fed to and rebroadcast by most NBC, CBS and PBS affiliates, is at 59.94 fields per second. The 720p format, as specified by SMPTE 296M and used by ABC, FOX and ESPN, has no fields. It delivers 59.94 progressive frames per second. Both 1080i and 720p mathematically match downconverted SD video.
People are confusing refresh rate (field rate) and actual frame rate. In the US and Japan, broadcasts have always been 30FPS with a frequency of 60Hz, i.e. each frame is illuminated twice. In other regions it's 25FPS at a frequency of 50Hz.
It gets confusing because, in the switch to HD, the names 1080i60 and 1080i50 got thrown around referring to the field rate, whereas the actual frame rates of these are still 30 and 25 respectively.
This is somewhat independent of whatever refresh rate your TV advertises, which adds more confusion. And all the motion-enhancement stuff modern LCDs advertise, with ridiculous refresh rates, is a different measurement yet again (I still don't really understand how Sony arrives at the numbers they use in their latest TVs, but they look good!).
You know, after more digging, I did find this article, which states that certain sports broadcasters do broadcast at 60p...
http://broadcastengineering.com/news/whats_the_difference_between_5994fps_and_60fps_07032011
In any event, I fucking hate frame rates.
Sports has always been broadcast at the highest field rate, 60i or 60p.
I'm a video editor who has worked in broadcast TV for over a decade, and I've never heard of any television station or channel accepting 720p60 or 1080p for that matter. That's not to say there aren't channels or stations that might broadcast at those rates, but the standard television frame rate in North America is 29.97 frames per second.
Here's some info on frame rates and standards from the Final Cut Pro manual:
From wikipedia...end of story
50p/60p is a progressive format and is used in high-end HDTV systems. While it is not technically part of the ATSC or DVB broadcast standards yet, reports suggest that higher progressive frame rates will be a feature of the next-generation high-definition television broadcast standards.[12] In Europe, the EBU considers 1080p50 the next step future proof system for TV broadcasts and is encouraging broadcasters to upgrade their equipment for the future.[13]
This is only in planning...they don't exist yet
NTSC doesn't have anything to do with HD broadcasts though.
Programs will be aired in the NTSC 525/29.97 drop frame format and care must be taken to ensure that all format and frame rate conversions are done properly to ensure consistent aspect ratios and timing.
Is this actually 60fps footage or is it akin to that shitty motionflow trumotion motioninmotion whatever the fuck they call it shitty interpolation
Edit: Reading a bit and it seems not? I think? This thread is pretty informative
I'm actually quite flummoxed by these 720p60 sports broadcasts. This must be an American thing... Does anyone know if TSN or Sportsnet broadcast like this? I can't seem to find anything.
? What I meant was: do our sports networks broadcast in 60p? I can't find anything about it, so I'm guessing no.
This announcement is stupid and misleading. All broadcast sports are already displayed in 60fps.
For streams at 720p (ESPN and ABC I believe -- and possibly others), they are true, progressive, no-holds-barred 60fps.
For the majority of broadcasts, which are 1080i, the vertical resolution is effectively halved (since two "display frames" are jammed into each "transmitted frame", one on every other scan line), but the image is interpolated by your TV each frame, so it still looks pretty good. The TV is able to recover the two distinct "display frames" from the single "transmitted frame" and display them separately (each for 1/60 of a second). So in this case it is the resolution which is diminished, not the framerate (which is still true 60fps).
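The "two display frames per transmitted frame" idea is really just slicing alternate scan lines (the array names are illustrative, and the 4 columns stand in for 1920):

```python
import numpy as np

# A stand-in for one 1080-line "transmitted frame" (4 columns for brevity).
transmitted = np.arange(1080 * 4).reshape(1080, 4)

top_field    = transmitted[0::2]   # odd scan lines (1, 3, 5, ...), 540 rows
bottom_field = transmitted[1::2]   # even scan lines (2, 4, 6, ...), 540 rows

# Each field is shown for 1/60 s, at half the vertical resolution.
print(top_field.shape, bottom_field.shape)  # (540, 4) (540, 4)
```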
So the only thing that MS can be doing here that is special is to broadcast in 1080p, which would still be novel and would still improve image quality somewhat (although by far the more important issue is what bitrate they would be transmitting at), but would not affect framerate.
the NFL app actually streams games? I always just use the TV app
This junior knows his shit.
To be strictly correct, when discussing interlaced formats like 1080i60, you have to talk about something called temporal resolution. The effective resolution of 1080i varies depending on what is shown. If a still frame is being shown, resolution is actually 1080 lines, because the even and odd fields are showing the same thing. When a moving scene is shown, the effective resolution decreases as the even and odd fields no longer match. In theory, if a scene is moving faster than the refresh rate of the recording camera, effective resolution can be as low as 540 lines, since the even and odd fields are showing completely different things. In reality the human eye cannot truly resolve the difference that well, so in most cases it's very difficult to see the variance in effective resolution, which is why watching NBC Sunday Night Football or the NFL on CBS in 1080i60 doesn't really look different at all from Fox NFL Sunday or ESPN Monday Night Football in 720p60.
Does anyone have an example video of a sporting event filmed at 60 FPS? I can't really grasp how moving to 60 FPS can make things much better for TV.
You mean 720p60? According to Wikipedia the two you mentioned are 1080i60. So still 60fps but with a lower effective vertical resolution.
The videos on this page http://deadspin.com/richard-sherman-breaks-up-pass-wins-game-goes-nuts-on-1504807735 are displaying at 60fps on my computer. Basically this is the same as you get on broadcast TV, though rare for internet broadcasts or even VODs.
I've already been completely wrong in this thread (but learned something at least), but according to my math, if the 60 refers to fields, and you need two fields to make a frame, then how is 60i not effectively 30p?
I get that they look similar - And I do work with 720p60 and 1080p60, which is the frame rate I prefer things recorded in when I want to slow them down.
Every piece of footage on my computer right now that has been shot in 1080i60 (which is the vast majority) all give me a native frame rate of 29.97 fps with a field dominance of top.
So what exactly did MS sign this deal to get? What is different from what they have now? Bear in mind I have never used the NFL app on the Bone.