Originally Posted by Bram
What I am rather certain about, is that a 1080i image source is the main bottleneck. From the marketing department point of view, it's a minor detail to convert 1080i sources to display perfectly on an 1080p panel. If you ask the engineers, they will tell you that 1080i is very close to 540p, and not at all anything like 1080p. Displays treat the 1080i source very similar to a 540p source, because it's not much more that can be done...
I believe you are mistaken. The only way 1080i is close to 540p is in bandwidth, NOT in the actual data or what is displayed. Progressive displays (LCDs and plasmas) treat 1080i and 540p signals differently. A 1080i frame has 1080 lines, split into two groups of 540 lines, each called a field, and it is the frame that is displayed, not a field. If a set properly deinterlaces, a 1080/60i field (540 lines) is held in memory until the next field arrives. The two fields are weaved/combined, corrected for motion, and then displayed; the result is 1080/30p. A 540p frame has only 540 lines, so if you received a 540/60p signal, the set would uprez it to fit your display (line doubling or some other method of creating the missing lines) and show that 'softer' picture. Proper deinterlacing of 1080i results in a full 1920x1080 frame of actual data, not 540 lines uprezed to 1080.
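To make the weave step concrete, here is a rough sketch in Python. The array shapes and the absence of motion correction are my own simplifications for illustration, not how any particular set actually implements it:

```python
# Minimal sketch of weave deinterlacing: two 540-line fields -> one 1080-line frame.
# Real deinterlacers also apply motion detection/compensation; this omits that.
import numpy as np

def weave_deinterlace(top_field, bottom_field):
    """Combine two 540x1920 fields (odd and even scan lines) into a 1080x1920 frame."""
    frame = np.empty((1080, 1920), dtype=top_field.dtype)
    frame[0::2, :] = top_field      # odd lines of the frame (1, 3, 5, ...)
    frame[1::2, :] = bottom_field   # even lines of the frame (2, 4, 6, ...)
    return frame

# Two 60 Hz fields weave into one 30 Hz frame: 1080/60i -> 1080/30p.
top = np.zeros((540, 1920), dtype=np.uint8)
bottom = np.ones((540, 1920), dtype=np.uint8)
print(weave_deinterlace(top, bottom).shape)   # (1080, 1920)
```

The point of the sketch is simply that every one of the 1080 output lines comes from real transmitted data; nothing is interpolated.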
Originally Posted by Bram
In other words, a 720p source contains higher resolution than a 1080i source, and a 768p display have better native pixel resolution than both those sources. That means 1080p displays won't help at all for these sources. Moving closer to the screen, blur or artifacts introduced by image scaling algorithms will be visible long before you start noticing the pixel grid.
There is absolutely no way a 720p source contains higher resolution than a 1080i source. The resolution of a 720p input is 1280x720; the resolution of a 1080i input is 1920x1080. You seem to be confusing resolution with display rate. A 720p source delivers a full 720-line frame 60 times a second. A 1080i source delivers 540-line fields 60 times a second, which combine into a full 1080-line frame 30 times a second.
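If it helps, here is the arithmetic spelled out (a back-of-the-envelope sketch using the raw pixel counts only; broadcast compression and chroma subsampling are ignored):

```python
# Per-frame resolution vs. per-second throughput for 720p and 1080i.
p720_frame  = 1280 * 720           # 921,600 pixels per 720p frame
i1080_frame = 1920 * 1080          # 2,073,600 pixels per full 1080i frame

p720_per_sec  = p720_frame * 60    # 720p: 60 full frames/s  -> ~55.3 M pixels/s
i1080_per_sec = i1080_frame * 30   # 1080i: 30 full frames/s -> ~62.2 M pixels/s

print(p720_frame, i1080_frame)     # per-frame: 1080i carries more than 2x the pixels
print(p720_per_sec, i1080_per_sec) # per-second: the two are much closer (bandwidth)
```

This is exactly the distinction above: the two formats are comparable in bandwidth, but the 1080i frame itself holds far more detail.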
A 768p set cannot have "better native pixel resolution" than a 1080p display. Native resolution refers to the fixed number of lines the panel can display, and the last time I checked, 1080 lines is more than 768 lines. A 720p source does not have higher resolution (same as above - number of lines) than a properly deinterlaced 1080i signal. When properly deinterlaced, a 1080i input provides a full 1920x1080 frame of actual data (on a 1080p panel). A 720p signal (on a 1080p display) will need lines 'added' by the scaler to make up the difference between 1280x720 and 1920x1080.
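Here is a rough sketch of that vertical scaling step. Real scalers use much better filters (bicubic, polyphase); this nearest-neighbour version is my own simplification, just to show that a third of the output lines have to be invented:

```python
# Minimal sketch of scaling 720 source lines up to 1080 panel lines (vertical only).
import numpy as np

def scale_vertical(frame_720, out_lines=1080):
    in_lines = frame_720.shape[0]                  # 720 real source lines
    # Map each output line back to the nearest source line (lines get repeated).
    src = np.arange(out_lines) * in_lines // out_lines
    return frame_720[src, :]

frame = np.arange(720 * 1280, dtype=np.uint16).reshape(720, 1280)
scaled = scale_vertical(frame)
print(scaled.shape)   # (1080, 1280): 360 of those 1080 lines are duplicates/estimates
# By contrast, a properly weaved 1080i frame starts with all 1080 lines of real data.
```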
Remember, the majority of HDTV is broadcast in 1080i. A 720p/768p display will have to deinterlace and scale those inputs down to its panel. With a good set receiving a good signal, blur and compression artifacts are not significant issues.