I did a little bit of math to try to figure out exactly what my display is doing. I came up with these figures by assuming that the xbr970 is 853 x 1,080 interlaced, which seems to me to be the most credible figure I've heard:
So, with a resolution of 853 x 1,080, this set has a total of 921,240 pixels. But because it is interlaced, only 460,620 of those pixels get refreshed each 1/60th of a second. A 480p signal (720 x 480) has 345,600 pixels, all of them refreshed every 1/60th of a second. So 1080i on this TV displays 115,020 more freshly refreshed pixels per 1/60th of a second than 480p. If you want to count all the pixels in both fields of the 1080i signal, it has 575,640 more pixels than 480p, but I don't think that is an accurate way of looking at it.

When I look at 480p and 1080i on this TV, I certainly don't see over 500,000 pixels of difference, probably because at any given moment half of those pixels are a field old, which causes a blurring effect. The 115,020 difference seems more accurate to me and closer to what I'm physically seeing.

If this TV had the resolution it is supposed to have, 1,920 x 1,080 interlaced, it would have 2,073,600 total pixels and 1,036,800 pixels refreshed every 1/60th of a second. That would be 691,200 more refreshed pixels than 480p per 1/60th of a second, which would be quite a difference and would look significantly better than 480p. But because this TV is only 853 x 1,080, it is missing 576,180 refreshed pixels per 1/60th of a second compared to a true 1080i signal. DAMN! It is missing more of a 1080i signal than what it is displaying, which is 460,620 refreshed pixels per 1/60th of a second.

So do you want to know the difference between the xbr970 and a 1080p flat panel when it comes to sharpness? 1080p has 1,612,980 more pixels per picture refresh than the xbr970. If you want to count both interlaced fields of the xbr970's picture rather than just one, that's still 1,152,360 more pixels for a 1080p HDTV than the xbr970. Wow, that is a huge difference imo.
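For anyone who wants to check my numbers, here's the whole thing as a quick Python sketch. The 853 x 1,080 figure for the xbr970 and the 720 x 480 figure for 480p are my assumptions, not measured values:

```python
# Pixel-count arithmetic for the xbr970 vs. 480p and 1080p.
# Assumes the xbr970 is 853 x 1080 interlaced and 480p is 720 x 480.
XBR970_W, XBR970_H = 853, 1080    # assumed native resolution of the xbr970
FULL_HD_W, FULL_HD_H = 1920, 1080 # true 1080i/1080p
SD_W, SD_H = 720, 480             # 480p

xbr970_total = XBR970_W * XBR970_H    # 921,240 total pixels
xbr970_field = xbr970_total // 2      # 460,620 refreshed per 1/60th s (interlaced)
sd_frame     = SD_W * SD_H            # 345,600 refreshed per 1/60th s (progressive)
full_total   = FULL_HD_W * FULL_HD_H  # 2,073,600 total pixels
full_field   = full_total // 2        # 1,036,800 refreshed per 1/60th s

print(xbr970_field - sd_frame)    # 115,020   -> xbr970 1080i field vs 480p
print(xbr970_total - sd_frame)    # 575,640   -> both xbr970 fields vs 480p
print(full_field - sd_frame)      # 691,200   -> true 1080i field vs 480p
print(full_field - xbr970_field)  # 576,180   -> what the xbr970 misses per field
print(full_total - xbr970_field)  # 1,612,980 -> 1080p frame vs one xbr970 field
print(full_total - xbr970_total)  # 1,152,360 -> 1080p frame vs both xbr970 fields
```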