I find the resolution discussion very interesting as this seems to be the ultimate gray area of HD capture, transmission, and display. As most of us are still evaluating and deciding for future equipment, I did some research about the issue.
These are the links I visited. They have very good, long explanations I will try to summarize:
http://www.hdlibrary.com/viewtopic.php?t=205
http://www.cedmagazine.com/ced/2004/0204/02a.htm
http://www.avd.com.au/Without_Frames...eography_1.htm
* When the Western world was trying to improve the resolution of TV, Japan already had a 1125i standard. The Western companies feared they could disappear against the competition and convinced the decision makers that a different standard should be created. Hence came 720p and 1080i, which are good, but not better than Japan's HDTV.
* For pure, uncompressed signals, 720p is better with moving images and 1080i is better with still images. This is because more lines mean more resolution, but interlacing introduces motion artifacts that are not present in progressive video.
* 720p is cheaper to transmit than 1080i; fewer lines are easier to process. Because of this, some broadcasters push for 720p.
* 1080i displays are cheaper to manufacture than 720p; processors are less expensive. Because of this, distributors and some manufacturers (some buyers too) push for 1080i.
* At any given moment, 1080i shows only slightly more lines than 720p. This is because of the Kell factor, which states Xi = 0.7 Xp. In other words, 1080i video is roughly equivalent to 756p video.
* Military applications demanded near-absolute precision, so progressive was favored over interlaced; because of this, contracted manufacturers push for 720p.
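The Kell factor arithmetic above can be checked directly. This is a minimal sketch, assuming the 0.7 figure quoted in the post; the function name is my own for illustration.

```python
# Kell factor sketch: the effective vertical resolution of an interlaced
# signal is roughly 0.7x its nominal line count (Xi = 0.7 * Xp, as quoted).
KELL_FACTOR = 0.7

def effective_progressive_lines(interlaced_lines: int, kell: float = KELL_FACTOR) -> int:
    """Approximate progressive-equivalent line count of an interlaced signal."""
    return round(interlaced_lines * kell)

print(effective_progressive_lines(1080))  # 756 -> roughly "756p", as stated above
```

So by this rule of thumb, nominal 1080i lands just a bit above 720p in perceived vertical resolution.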
You can see why manufacturers, broadcasters, and cable/sat providers have been struggling for years over the format issue. As technical advances have made the less expensive interlaced systems more precise, the quality gap between the formats has narrowed. The Kell factor also indicates that what we actually see is closer than the raw numbers suggest.
Manufacturers and cable/sat providers have been wise and offer products that work in either format; broadcasters are left to decide which one they are going to use. It's all good for us, but we still have to evaluate which product is better for our needs.
Another fact is that the signals our equipment displays are heavily processed. Even DVD video is a compressed format; furthermore, high-definition DVD will still be a processed format too. So it all comes down to which equipment (cable transmitter, cable box, DVD player, HDTV) processes these signals best.
What is better then? Well-processed 1080i will look about the same as 720p, which leaves us with the fact that 1080i still carries more lines. The net result should be a slightly higher-resolution image that you can view from closer up.
Our Toshibas take all signals, convert them to 720p, and fit them to the big screen. My feeling is that a 1080i signal will have the same effect as comparing a 5-megapixel and a 3.5-megapixel picture on your 17" computer monitor. Both pictures look great, but the 5-megapixel one looks better, particularly when enlarged, because it has more information.
Using the 5-megapixel picture analogy, a 1080i image has more information than a 720p image, and it is fitted to a big screen. Given high-quality equipment, I would say 1080i should look better than 720p, for the sheer fact that it has more information. BUT (big but) ultimately different equipment combos could prove better for either format; you'll just have to test them...as always.
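The "more information" argument boils down to simple pixel counting. Here's a quick sketch using the standard HD raster sizes (1920x1080 and 1280x720); the interlaced "per instant" figure halves the lines, since only one field is drawn at a time.

```python
# Rough pixel-count comparison behind the "more information" argument.

def pixels(width: int, height: int) -> int:
    return width * height

full_1080   = pixels(1920, 1080)  # 2,073,600 pixels per full frame
full_720    = pixels(1280, 720)   #   921,600 pixels per full frame
field_1080i = pixels(1920, 540)   # 1,036,800 pixels per interlaced field

print(full_1080 / full_720)       # 2.25 -> a full 1080 frame holds 2.25x the data
print(field_1080i / full_720)     # 1.125 -> but per instant, 1080i shows only ~12.5% more
```

The gap between the two ratios is exactly the point of the Kell factor discussion: 1080i's advantage on paper is much larger than what reaches the screen at any one moment.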