There is so much time spent worrying about HDTV resolution and which of the umpteen million HDTV formats will give you the best picture. It's crazy, considering that the picture heading into the set can be in one of umpteen million times two formats.
I'm editing in HD right now, with some SD footage in the timeline alongside HD footage from a Panasonic P2 camera. I'm editing at 1280x720p at 59.94 fps.
Technically I'm dealing with the DVCPRO HD codec. Swell. Except that it isn't really 1280x720p. It's less than that, with the pixels squeezed; when they're stretched back out, they become that. To rip off Wikipedia: 720p is horizontally downsampled from 1280x720 to 960x720, and 1080i is downsampled from 1920x1080 to 1280x1080 for 59.94i.
A lot, a lot, a lot of HD is being shot on DVCPRO HD. So, potentially, your glorious HD is coming from a 960x720 source.
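The squeeze works out to a simple ratio. Here's a rough sketch of the math, using the Wikipedia figures above (the function name is just for illustration):

```python
# DVCPRO HD stores anamorphic (horizontally squeezed) frames.
# Comparing stored width to display width gives the pixel aspect
# ratio -- how much each stored pixel gets stretched on playback.

def pixel_aspect_ratio(display_width, stored_width):
    """Horizontal stretch factor applied to each stored pixel."""
    return display_width / stored_width

# 720p: stored at 960x720, displayed at 1280x720
par_720 = pixel_aspect_ratio(1280, 960)    # 4/3, about 1.33

# 1080i at 59.94i: stored at 1280x1080, displayed at 1920x1080
par_1080 = pixel_aspect_ratio(1920, 1280)  # 3/2, exactly 1.5

print(par_720, par_1080)
```

In other words, every "1280-wide" pixel in that 720p frame was interpolated back up by a third.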
To top that off, if you have digital cable, they're taking that network-delivered source and compressing the hell out of it to get it to your TV. More blur. More compression artifacts. Imagine what they'll do to 1080p when it starts gobbling bandwidth.
I rendered an uncompressed 720p video today. 8 minutes long. 35 gigs.
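The back-of-the-envelope math shows why uncompressed HD eats disks. A sketch, assuming 8-bit 4:2:2 chroma (2 bytes per pixel); the exact figure depends on bit depth, chroma subsampling, frame rate, and container overhead:

```python
# Rough storage estimate for uncompressed video.
# Assumes 2 bytes per pixel (8-bit 4:2:2); real renders vary with
# bit depth, chroma subsampling, and container overhead.

def uncompressed_gb(width, height, fps, seconds, bytes_per_pixel=2):
    total_bytes = width * height * bytes_per_pixel * fps * seconds
    return total_bytes / 1e9  # decimal gigabytes

# 8 minutes of 720p at 59.94 fps
print(round(uncompressed_gb(1280, 720, 59.94, 8 * 60), 1))  # about 53 GB
```

Swap in the actual bit depth and frame rate of a given render to match its size; either way, the numbers get big fast.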
It seems that HD is all about transcoding. Just because the final resolution matches the standard doesn't mean that's how it started. Lots of changes along the way.
So all the worry about resolution and 1:1 pixel mapping is fine and all, but not really that important. I figure it still comes down to the electronics and software in the screen. HD is all about scaling. The better your TV scales, the better off you'll be.