Originally Posted by Blackraven
1080i is more like a 644p from what I've heard.
Although I did wish that:
Interlace = Progressive divided by 2
Making 1080i = 540p
Things would have been more simple that way but I guess that's what electronics really is about (complex mumbo-jumbo)
A 1080i at 60 Hz source can mean one of three things:
1) A true 1080i60 source (every field captured at a different point in time), giving an effective 540p at 60 Hz, with a slight half-line (or one full 1080-line) flicker up and down. This can give an illusion of full 1080-line resolution on static images, but it handles moving objects poorly.
2) A 1080p at 30 Hz source, which is reconstructed by weaving the two fields of each frame back together.
3) A 1080p at 24 Hz source, reconstructed by weaving the correct fields back together and applying inverse 3:2 pulldown. However, most TVs don't refresh at a multiple of 24 Hz; they display at 60 Hz instead, with a 3:2 cadence.
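To make case #3 concrete, here's a minimal sketch (hypothetical names, plain Python) of how 3:2 pulldown maps 24 fps film frames onto 60 Hz interlaced fields: frames alternately contribute 2 fields and 3 fields, so 4 film frames become 10 fields. Inverse 3:2 pulldown just detects this cadence, drops the repeated fields, and weaves the rest back into the original 24 progressive frames.

```python
# Sketch of 3:2 pulldown: 24 fps film -> 60 fields/s interlaced.
# Frame and field labels here are purely illustrative.
def pulldown_32(frames):
    """Alternate 2-field and 3-field repeats; parity alternates per field."""
    fields = []
    for i, frame in enumerate(frames):
        repeat = 2 if i % 2 == 0 else 3          # the "3:2" cadence
        for _ in range(repeat):
            parity = "top" if len(fields) % 2 == 0 else "bottom"
            fields.append((frame, parity))
    return fields

film = ["A", "B", "C", "D"]    # 4 film frames = 1/6 second at 24 fps
fields = pulldown_32(film)     # 10 fields    = 1/6 second at 60 fields/s
```

Note that frames B and D each appear in three fields, which is why untreated 3:2 material shows the familiar judder on pans.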
Flat panels are inherently progressive displays. If the native signal is truly 60 separate moments in time (#1), flat panels don't handle 1080i very well, though a CRT handles it great.
Even for sports, is 1080i really 60 separate frames in time? For movies or the HD Discovery channel, we actually have situation #2 or #3, which flat panels are designed to work with (though about half don't do it well). 1080i60 and the 1080p (24 or 30) signals that it represents are in fact identical, provided the TV can put the signal back together into a progressive image properly.
So, only situation 1 is a problem. In the other cases, 1080i and 1080p are identical. Having said that, 1080p is usually best, as there is no chance of mangling the image when putting it back together.
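A quick sketch (hypothetical, treating a frame as a list of lines) of why case #2 is lossless: splitting a progressive frame into its two fields and weaving them back reproduces the original exactly, which is all a competent deinterlacer has to do.

```python
# Weave deinterlacing demo: a progressive frame split into fields
# and woven back is bit-identical to the original.
def split_fields(frame):
    top = frame[0::2]       # even-numbered lines -> top field
    bottom = frame[1::2]    # odd-numbered lines  -> bottom field
    return top, bottom

def weave(top, bottom):
    frame = []
    for t, b in zip(top, bottom):
        frame.extend([t, b])    # re-interleave the two fields
    return frame

frame_1080p = [f"line {n}" for n in range(1080)]  # one progressive frame
top, bottom = split_fields(frame_1080p)           # two 540-line fields
assert weave(top, bottom) == frame_1080p          # perfect reconstruction
```

The mangling risk mentioned above comes from TVs that guess the cadence wrong and weave fields from two different moments in time, which produces combing artifacts.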