"Though a 1080i frame is presented in two fields of 540 lines each, they are staggered on the screen and do present over twice the number of pixels as 720p - PERIOD! I do not believe that your term of "apparent resolution" is legitimate, (A Google search, yields lots of entires - but none in the first 10 pages of results relates to your use (how an interlaced image relates to a progressive one). Granted, our 720p standard is superior for lots of action - hence it's choice by ESPN, but is clearly inferior for lots of detail - hence it's choice by Discovery (and HBO, Showtime, Starz, CBS, NBC, CW, TNT, etc - everyone but ESPN, ABC & Fox). What would be ideal (but I don't think that the infrastructure would support it) would be to switch the program scan mode (between 1080ix1920 and 720px1280) on a program by program basis."
Dave, thanks for trying to get a handle on this complex issue. Remember, there is no standard way to express "resolution". Is it the number of display pixels per picture on the face of the display? Is it the bandwidth of the video amp? Is it the number of vertical scan lines? Is it the horizontal pixel count? I could go on, but you get my point.
On a fixed-pixel display with enough pixels to fully display a 1080p source (without its fixed overscan, the A2000 might fit this definition), you have the potential to see everything the source contains. But this ability to present an image that displays all of the resolution, moving or static, is only true for a quality 1080p source. If the brightness is too high, or the sharpness is too high, the resolution you will see will be reduced, because the display will not present all the resolution the source and the pixel count can deliver; it is either masked or distorted.
But sometimes the brain plays tricks on us in our perception of detail. An interlaced 1080 source is an example of this. Since the two fields are set apart from each other in time, and the image scanned is slightly different in each field, the brain reduces the perceived detail. This is a known effect and there is a formula for it; it accounts for roughly a 20% reduction in our perceived detail, or the displayed apparent resolution.
Thus, if we are talking about a 720p source displayed on an ideal 720p display, compared to a 1080i source displayed on an ideal 1080i display, the apparent resolution between the two would be very close. Again, the 720p source and display would be superior with moving images, and the 1080i source and display would be superior with static images. Don't get hung up on pixel counts; the two formats are different, and 720p simply does not need as many pixels to present the brain with adequate resolution.
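To put rough numbers on that, here is a back-of-the-envelope sketch using the ~20% interlace figure I mentioned above (an assumed round number, not a precise psychovisual model):

    # Rough "apparent" vertical resolution comparison.
    # The 0.8 factor is the ~20% interlace reduction cited above (an assumption).
    INTERLACE_FACTOR = 0.8

    apparent_1080i = 1080 * INTERLACE_FACTOR   # ~864 effective lines
    apparent_720p  = 720                       # progressive, no interlace penalty

    print(apparent_1080i, apparent_720p)       # 864.0 vs 720 -- much closer than raw 1080 vs 720

The gap between 864 and 720 effective lines is far smaller than the raw pixel counts suggest, which is why I say the two formats end up close in practice.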
But now we own a set that does a good job of displaying 1080p source material and of converting 720p and 1080i source material to 1080p. There are so many fudge factors that I can't get a handle on this any longer. When we all switch to 120-inch displays in about 5 years, we will all lust for 1080p or perhaps even 2160p source material.
My point is that both 1080i and 720p source material are very close in their ability to provide a quality HD image, and it will take time to see how well the new 1080p chip sets do with such source material on a quality 1080p display. In the short time I have had to play with my 60-inch A2000, I like what I see with programs presented in both formats. With talk of 120 Hz scan-rate displays combined with 1080i/24 source material on 1080p displays, 1080i may prove superior to 720p/24. Check this out next year.
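One reason the 120 Hz talk matters for film-rate material: 120 is an even multiple of both 24 and 60, so 24-frame content can be shown with a clean, uniform cadence instead of uneven 3:2 pulldown (a simple illustration of the arithmetic, not a claim about any particular set):

    # Why a 120 Hz refresh rate suits 24-frame film-rate material:
    # 120 divides evenly by both 24 and 60, so no 3:2 pulldown judder is required.
    refresh_hz    = 120
    film_fps      = 24
    broadcast_fps = 60

    print(refresh_hz % film_fps == 0, refresh_hz // film_fps)        # True 5 -> each film frame shown exactly 5 times
    print(refresh_hz % broadcast_fps == 0, refresh_hz // broadcast_fps)  # True 2 -> 60 Hz material also maps evenly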