
I notice that like no one talks about 1080i. Hell, 720i doesn't even seem to exist.

720p, 1080i, 1080p, etc., etc. need to fucking die. If you are talking about resolution, use the actual resolution. If you don't, you're dumb and lazy and have no clue what you're talking about. Or you're trying to intentionally confuse people. PERIOD.

These p and i terms only exist to obfuscate and confuse everyone, and they're largely irrelevant to what people are ACTUALLY trying to convey when they talk about resolution.

There is only one sane way of talking about output and display resolution and that is in the X x Y format.

Most displays these days are one of: 1366x768, 1440x900, or 1920x1080. If you use any other way of representing that, you're just wrong. End of story.

Game output resolution especially! No game outputs in p's or i's.

Halo 4? 1280x720. Destiny? 1920x1080.
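
The shorthand doesn't even carry that information; you have to look it up anyway. A trivial sketch of the point (the table values are just the common mappings):

```python
# Why the shorthand underdetermines things: two of these "modes" have
# identical pixel dimensions, and the suffix only describes scan type,
# not resolution. Values are the common mappings.
MODES = {
    "720p":  ((1280, 720),  "progressive"),
    "1080i": ((1920, 1080), "interlaced"),
    "1080p": ((1920, 1080), "progressive"),
}

for shorthand, ((w, h), scan) in MODES.items():
    print(f"{shorthand} -> {w}x{h}, {scan}")
```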

I wish people would fuck off with this p shorthand shit that doesn't even really mean anything anymore when people use the terms. Just be accurate and there will be no confusion. Takes 10 seconds to figure it out if you read.

Edit: if you need to explain that the signal is interlaced rather than progressive (hint: you don't), then just say it in X x Y and add the extra info afterwards.
 
1080i has the same resolution as 1080p. It has inferior temporal resolution and can display issues with fast-moving subjects.

1080i is temporal compression. A still image in 1080i is the same as in 1080p.

720p is spatial compression. It will not display issues with fast motion, but it has much less resolution than 1080i.

For this reason, 720p is superior for sports, where the eye is following a fast-moving object such as a hockey puck. In this instance spatial compression is preferred and temporal compression is inferior.

Most stations broadcast in 1080i because it uses about the same bandwidth as 720p but delivers over twice the pixels per frame. For dramas, comedies, and many movies, it is the best solution.
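
To put rough numbers on the bandwidth claim, here's a back-of-the-envelope sketch (raw uncompressed pixel rates only; actual broadcast bandwidth depends on the codec, so treat this as illustrative):

```python
# Raw pixel throughput per format. An interlaced format only transmits
# half its lines per pass (one 540-line field per 1/60 s for 1080i60).
formats = {
    "720p60":  (1280, 720,  60, False),  # width, height, rate, interlaced?
    "1080i60": (1920, 1080, 60, True),   # 60 fields/s = 30 full frames/s
    "1080p60": (1920, 1080, 60, False),
}

for name, (w, h, rate, interlaced) in formats.items():
    lines_per_pass = h // 2 if interlaced else h
    pixels_per_sec = w * lines_per_pass * rate
    print(f"{name}: {pixels_per_sec / 1e6:.1f} Mpixels/s raw")

# 720p60  -> ~55.3 Mpixels/s
# 1080i60 -> ~62.2 Mpixels/s (same ballpark, ~2.25x the per-frame pixels)
# 1080p60 -> ~124.4 Mpixels/s (why broadcasters avoided it)
```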

1080i, again, has an advantage on most sets which display 1080p natively as there is no scaling involved. A good quality deinterlacer will provide excellent picture quality on all but the fastest moving objects.

Obviously 1080p is preferable to either, but given the realities of bandwidth limitations, these two standards offer solutions to the problem via either temporal or spatial compression. Each has its own set of pros and cons.

Anyone who says 1080i is "bad" outs themselves as misinformed...to be polite about it.
 
I don't think any TVs that only do 1080i have come out since like 2006

You'd be surprised.

Wal-Mart Black Friday TVs can be laughably shitty. Even as recent as 4 years ago, their cheapest big-ass TVs on Black Friday were 1080i/480p only.

When the PS3 launched, there were a ton of people that discovered their TV couldn't actually do 720p at all.
 
1080i is useless for consumers. 720p is the superior format at the same transmission bandwidth.

There were exactly two applications for 1080i:
  • dumping loads of old stock of technically SD CRTs onto markets that didn't have / didn't care about minimum standards for things to be called "HDTV"
  • talking point for Xbox fans
 
1080i has the same resolution as 1080p. It has inferior temporal resolution and can display issues with fast-moving subjects.

1080i is temporal compression. A still image in 1080i is the same as in 1080p.

720p is spatial compression. It will not display issues with fast motion, but it has much less resolution than 1080i.

For this reason, 720p is superior for sports, where the eye is following a fast-moving object such as a hockey puck. In this instance spatial compression is preferred and temporal compression is inferior.

Most stations broadcast in 1080i because it uses about the same bandwidth as 720p but delivers over twice the pixels per frame. For dramas, comedies, and many movies, it is the best solution.

1080i, again, has an advantage on most sets which display 1080p natively as there is no scaling involved. A good quality deinterlacer will provide excellent picture quality on all but the fastest moving objects.

Obviously 1080p is preferable to either, but given the realities of bandwidth limitations, these two standards offer solutions to the problem via either temporal or spatial compression. Each has its own set of pros and cons.

Anyone who says 1080i is "bad" outs themselves as misinformed...to be polite about it.

I've been wondering for a while if 1080i is better than 720p for the Vita TV. It's upscaling from an internal resolution of 960x544, which means that practically the entire frame would be shoved through at 1080i, with minimal scaling artifacts. On the other hand, it's still being displayed as interlaced, which I suppose would affect motion. If only 540p was an option...

Edit: That or, if only Vita TV supported 1080p. I'm guessing there's some hardware limitation on that.
 
I don't think any TVs that only do 1080i have come out since like 2006

Heck, there were televisions being sold during the 360's run that had component hookups but were 480i. Both of my brothers had a TV like that. One friend didn't believe me until I showed him the TV in person.

A large chunk of TV sales are driven by sports broadcasts, and sports are broadcast in 720p.
 
I've been wondering for a while if 1080i is better than 720p for the Vita TV. It's upscaling from an internal resolution of 960x544, which means that practically the entire frame would be shoved through at 1080i, with minimal scaling artifacts. On the other hand, it's still being displayed as interlaced, which I suppose would affect motion. If only 540p was an option...

In my opinion, 720p would be superior for this application.

Either resolution would involve scaling and many of the games will feature fast motion.

You're deciding between temporal or spatial compression. There is fast motion, so temporal compression is bad for this application. And really there is no spatial compression, since 720p is already a higher resolution than the Vita renders at natively.

So what you're really deciding between is temporal compression vs no compression. Both will involve scaling.

The answer is obvious: no compression. For PS TV, 720p is the answer.

For most broadcast media aside from sports, 1080i is usually the answer.
 
In my opinion, 720p would be superior for this application.

Either resolution would involve scaling and many of the games will feature fast motion.

You're deciding between temporal or spatial compression. There is fast motion, so temporal compression is bad for this application. And really there is no spatial compression, since 720p is already a higher resolution than the Vita renders at natively.

So what you're really deciding between is temporal compression vs no compression. Both will involve scaling.

The answer is obvious: no compression. For PS TV, 720p is the answer.

For most broadcast media aside from sports, 1080i is usually the answer.

Both involve scaling, yes, but in this case 1920x1080 allows for nearly integer 2x scaling (960x544 doubles to 1920x1088, just 8 lines too tall), which (ideally) should not produce noticeable artifacts. I suppose 720p won't be losing any pixel information, though.
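
For what it's worth, the scale factors are easy to check (plain arithmetic on the resolutions in question, nothing Vita-specific assumed):

```python
# Scale factors from the Vita's internal render resolution to each output.
native = (960, 544)
targets = {"720p": (1280, 720), "1080i / 1080p": (1920, 1080)}

for name, (w, h) in targets.items():
    print(f"{name}: {w / native[0]:.3f}x horizontal, {h / native[1]:.3f}x vertical")

# 720p:          1.333x horizontal, 1.324x vertical (non-integer both axes)
# 1080i / 1080p: 2.000x horizontal, 1.985x vertical (exact 2x would be
#                1920x1088, 8 lines too tall, so it's *almost* integer)
```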
 
Under the correct circumstances 1080i is just as good as 1080p.

It's still 60 Hz.
It still offers full 1080-line resolution.
60 interlaced fields = 30 full frames.

The deinterlacer determines how well the image is deinterlaced, or how noticeable flicker is.

FYI, CRT TVs can do more than just 720p or 1080i. Good ones, like the old XBRs, can do other progressive-scan resolutions like 1792x1008, 1840x1045, or even 900p perfectly with no scaling, and they generally show far fewer jaggies and can offer a sharper picture, contrary to popular belief.
 
FYI, CRT TVs can do more than just 720p or 1080i. Good ones, like the old XBRs, can do other progressive-scan resolutions like 1792x1008, 1840x1045, or even 900p perfectly with no scaling, and they generally show far fewer jaggies and can offer a sharper picture, contrary to popular belief.

CRTs are indeed quite flexible; it's mostly the weight that killed them off.
 
My CRT Rear-pro had a brilliant deinterlacer so 1080i always looked better than 720p. People saying 720p looks better obviously have sets with cheap-ass deinterlacing components.
 
Heck, there were televisions being sold during the 360's run that had component hookups but were 480i. Both of my brothers had a TV like that. One friend didn't believe me until I showed him the TV in person.

A large chunk of TV sales are driven by sports broadcasts, and sports broadcast in 720p.

I had a 480i TV with component inputs. I noticed a difference. But then again, before that I was playing my PS2 through a VCR to a TV with UHF and VHF knobs on it. Hell, that TV was the first TV in my house to have stereo sound, and this was 2006.

I then bought a 1080i CRT 6 months later.
 
1080i was terrible for gaming. Since consoles operate at 60 Hz, the solution for outputting 1080i60 was to scale each 1280x720 frame to a 1920x540 field. The HDTV would then take this 1920x540 field and run it through its deinterlacer, which was designed for converting 1080i60 to 1080p30. This meant that most people were looking at an effective 1280x540 image at 30fps that was the combination of two frames that didn't necessarily have much to do with each other. Lots of scan line artifacts.

Some LCD HDTVs just displayed each field by converting each 1920x540 field into a 1920x1080 frame, but you still suffered a resolution loss going 720p -> 540p -> 1080p. The only situation where 1080i even began to make sense was those using CRTs with a native 1080i scan rate, and then only because there was a small chance that the console would do a better 720p60 conversion than your TV. 720p60 was the best option for gaming until 1080p upscaling was standard on consoles.
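
A sketch of that console-side conversion, assuming nearest-neighbour scaling for brevity (real scalers filter, so this is illustrative only):

```python
import numpy as np

# Sketch of the console-side 720p -> 1080i conversion described above.
def frame_to_1080i_field(frame: np.ndarray, parity: int) -> np.ndarray:
    """Scale one rendered 1280x720 frame up and keep the 540 lines
    belonging to this field (parity 0 = even lines, 1 = odd lines)."""
    assert frame.shape[:2] == (720, 1280)
    rows = np.linspace(0, 719, 1080).astype(int)   # 720 -> 1080 lines
    cols = np.linspace(0, 1279, 1920).astype(int)  # 1280 -> 1920 columns
    scaled = frame[rows][:, cols]                  # now 1080x1920
    return scaled[parity::2]                       # 540x1920 field

# Two consecutive 60 fps frames become the two fields of one 1080i "frame",
# so a TV that weaves them back together mixes two moments in time.
even = frame_to_1080i_field(np.zeros((720, 1280)), parity=0)
odd  = frame_to_1080i_field(np.ones((720, 1280)),  parity=1)
print(even.shape, odd.shape)   # (540, 1920) (540, 1920)
```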

We can gladly put 1080i to rest as it's completely unnecessary these days.
 
Forgot about 1080i until I went to visit my parents last week. I had just helped them pick out a reasonable LG TV in the past month and was shocked by how horrible it looked. After adjusting the ridiculous color settings it came with, the picture quality was still poor.

That was when I realized that, since I never watch TV myself, I hadn't noticed the Dish Network receiver was only outputting 1080i. Because all my parents watch is sports, I ended up knocking it down to 720p.
 
What does 1080i have to do with Xbox? Even the first Xbox did 720p games quite well.
Might be about how the 360 can scale to different resolutions easily. Our 360 looked really good with nearly every game on our old TV because it could do 1080i fine and feed that to our 480p/1080i TV, whereas our PS3 would only do 1080i if the game supported it, and drop down to 480p for unsupported ones.
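
Roughly, as a toy fallback model (purely illustrative; not how either console actually negotiates modes):

```python
# Toy model of the output-mode fallback behaviour described above.
def pick_output(game_modes: set[str], console_can_scale: bool,
                tv_modes: list[str]) -> str:
    # tv_modes is ordered best-first, e.g. ["1080i", "480p"].
    for mode in tv_modes:
        if console_can_scale or mode in game_modes:
            return mode   # a hardware scaler can hit any TV-supported mode
    return "480i"         # last-resort fallback

tv = ["1080i", "480p"]
game = {"720p", "480p"}   # a game whose supported modes top out at 720p

print(pick_output(game, console_can_scale=True,  tv_modes=tv))  # 1080i (360-style)
print(pick_output(game, console_can_scale=False, tv_modes=tv))  # 480p (PS3-style)
```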
 
I had a 480i TV with component inputs. I noticed a difference. But then again, before that I was playing my PS2 through a VCR to a TV with UHF and VHF knobs on it. Hell, that TV was the first TV in my house to have stereo sound, and this was 2006.

I then bought a 1080i CRT 6 months later.

We rocked a black and white TV up until a bit of the ways into the SNES era

We rented Bart vs the Space Mutants and couldn't beat it since we couldn't see which objects we were supposed to spraypaint.
 
We rocked a black and white TV up until a bit of the ways into the SNES era

We rented Bart vs the Space Mutants and couldn't beat it since we couldn't see which objects we were supposed to spraypaint.

I played Sonic Jam on the Game.com and had to collect the blue spheres.
 
Interlaced is disgusting, don't even do it

My Sony 34XBR960 disagrees. Best non-4K HD picture you can get with practically zero input lag and deep inky black levels.

The only time I notice any interlacing is on the Wii U plaza: sometimes a Miiverse post will pop up with tiny horizontal black lines that will wink in and out as it passes through the interlacing. For everything else it looks practically flawless, sans a slight convergence error in my lower left corner.

EDIT: Holy shit at all of the misinformation in this thread. If you're going to be a resolution snob at least learn what the fuck you're talking about, Jesus H.
 
My Sony 34XBR960 disagrees. Best non-4K HD picture you can get with practically zero input lag and deep inky black levels.

The only time I notice any interlacing is on the Wii U plaza: sometimes a Miiverse post will pop up with tiny horizontal black lines that will wink in and out as it passes through the interlacing. For everything else it looks practically flawless, sans a slight convergence error in my lower left corner.

EDIT: Holy shit at all of the misinformation in this thread. If you're going to be a resolution snob at least learn what the fuck you're talking about, Jesus H.

It looks fantastic because it's a CRT, not because it's interlaced. I had a 40XBR450 back in the day. I also had Sony CRT monitors, and a progressive image on a CRT is tremendously better than an interlaced one. Your post basically amounts to "CRTs look awesome. Mine being interlaced doesn't have anything to do with how great it looks, but I'm going to act like you are spreading misinformation anyway."

Interlacing an image is a solution for trying to get more apparent detail in an image when you are bandwidth constrained. The cost of interlacing is of course interlacing artifacts which you describe your display as suffering from in your post.
 
720p was originally favored for action-y content like sports; interlacing loses more detail and judders more across pans and zooms than progressive.

1080i was preferred for broadcasting general content because it takes less bandwidth
 
It looks fantastic because it's a CRT, not because it's interlaced. I had a 40XBR450 back in the day. I also had Sony CRT monitors, and a progressive image on a CRT is tremendously better than an interlaced one. Your post basically amounts to "CRTs look awesome. Mine being interlaced doesn't have anything to do with how great it looks, but I'm going to act like you are spreading misinformation anyway."

Interlacing an image is a solution for trying to get more apparent detail in an image when you are bandwidth starved. The cost of interlacing is of course interlacing artifacts which you describe your display as suffering from in your post.

It looks fantastic because I'm running the native resolution (1080i) for my display. Any progressive signal (480p, 720p) that I put into my TV gets scaled to 1080i anyway and introduces scaling errors/jaggies/etc., so setting the device to output 1080i right away will always yield the best results. "Use the native resolution for your display" is like home AV 101...are you sure you know what you're talking about?

The misinformation that I was talking about is all the posts saying "lol interlacing, never". On most modern digital displays a 1080i signal will indeed look like shit and 720p/1080p is the way to go. But if you did your research and got an HDTV that's actually built for gaming (read: an HD CRT with minimal/no input lag, no post processing, good contrast/black levels, etc.), 1080i is the best option.
 
720p was originally favored for action-y content like sports; interlacing loses more detail and judders more across pans and zooms than progressive.

1080i was preferred for broadcasting general content because it takes less bandwidth

720p was intended for any kind of fast action because using 1080i at a full 60 unique fields per second means you end up with combing.

see: http://en.wikipedia.org/wiki/Interlaced_video#Interlacing_problems

It looks fantastic because I'm running the native resolution (1080i) for my display. Any progressive signal (480p, 720p) that I put into my TV gets scaled to 1080i anyway and introduces scaling errors/jaggies/etc., so setting the device to output 1080i right away will always yield the best results. "Use the native resolution for your display" is like home AV 101...are you sure you know what you're talking about?

The misinformation that I was talking about is all the posts saying "lol interlacing, never". On most modern digital displays a 1080i signal will indeed look like shit and 720p/1080p is the way to go. But if you did your research and got an HDTV that's actually built for gaming (read: an HD CRT with minimal/no input lag, no post processing, good contrast/black levels, etc.), 1080i is the best option.

Your TV is limited by the fact that its maximum scan rate is 1080i. You keep claiming the broad benefits of using a CRT without realizing that you are limited to an interlaced signal. If your HDTV could display a true 1080p image it would look tremendously better. It would be much brighter, as the phosphors would be stimulated at 1080 lines 60 times per second as opposed to 540 lines 60 times per second. You would also avoid combing and other interlacing artifacts. Your TV relies on a 3D comb filter to minimize interlacing artifacts, since they are inherent to the incoming signal. I realize your TV looks great, but you are only championing 1080i because your TV is limited to it.

No.

You get 1080 lines, but you get them in two interlaced fields. You lose temporal resolution, not spatial resolution.

Not quite. What you get is 60 1920x540 fields per second, with every other field offset on the vertical axis. This attempts to get you the benefits of 1080 lines of resolution while also giving you the benefit of 60 fps. On a CRT this effect is somewhat realized, at the cost of interlacing artifacts. When moving to a fixed-pixel display it completely falls apart: you get to pick either 1080p30 (after deinterlacing) or 1920x540p60 (which is then scaled to 1920x1080p60). You can't get more out of the source signal than is contained in it.
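
A minimal sketch of those two choices (weave vs. bob) on 1920x540 fields, illustrative only:

```python
import numpy as np

# The two basic deinterlacing options for the fields described above.
def weave(top: np.ndarray, bottom: np.ndarray) -> np.ndarray:
    """Interleave two 540-line fields into one 1080-line frame (-> 1080p30).
    If the fields were captured at different moments, motion combs."""
    frame = np.empty((1080, top.shape[1]), dtype=top.dtype)
    frame[0::2] = top
    frame[1::2] = bottom
    return frame

def bob(field: np.ndarray) -> np.ndarray:
    """Stretch one 540-line field to 1080 lines (-> 60 frames/s output,
    but with only 540 lines of vertical detail)."""
    return np.repeat(field, 2, axis=0)

top, bottom = np.zeros((540, 1920)), np.ones((540, 1920))
print(weave(top, bottom).shape)  # (1080, 1920): one frame per two fields
print(bob(top).shape)            # (1080, 1920): one frame per field
```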
 
I had a 480i TV with component inputs. I noticed a difference. But then again, before that I was playing my PS2 through a VCR to a TV with UHF and VHF knobs on it. Hell, that TV was the first TV in my house to have stereo sound, and this was 2006.

I then bought a 1080i CRT 6 months later.

My last set was a 480i CRT with component hookups. Post digital conversion, I don't care what anyone says, that thing looked nice. Granted, my HD set is worlds better, but SDTV gets a worse rap than it deserves because of source issues. In terms of signal quality, it went something like OTA > component >>>> composite >>> cable box over composite.
 
Nintendo uses interlacing for their handhelds, while Sony uses progressive displays.

Um

[image]


I wouldn't say that, especially since the PSP-3000 is the only handheld I've seen that uses interlacing. I haven't played the 3DS XL or 2DS, though; do those have interlaced displays, perhaps?
 
Might be about how the 360 can scale to different resolutions easily. Our 360 looked really good with nearly every game on our old TV because it could do 1080i fine and feed that to our 480p/1080i TV, whereas our PS3 would only do 1080i if the game supported it, and drop down to 480p for unsupported ones.
Ah yeah that makes sense. I also enjoyed both the Xbox and Xbox 360's VGA hookups. The 360 was amazing as it supported a ton of PC resolutions.
 