Originally Posted by Kelson
A 1080p display simply collects both frames and deinterlaces them to construct a 1080p image comprised of the 2,073,600 original pixels (full 1080x1920 detail, not interpolated) it displays at I believe 30Hz.
It would only work that way if the original source content was 1080p/30, and I think that is rare. For content that is recorded in 1080i, the two fields of a frame are captured at different points in time, so if there is any motion between them, it is not possible to reconstruct a perfect 1080p frame from the two fields. This is where deinterlacing gets complicated.
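To make that concrete, here's a toy sketch (my own illustration, not any real deinterlacer) of naive "weave" deinterlacing: it interleaves the two fields of a tiny frame in which a bright bar moves between the two field capture times, producing the classic combing artifact.

```python
# Sketch: why weaving two 1080i fields with motion between them
# cannot reconstruct a perfect progressive frame.
# We model a tiny 6x8 image with a 1-pixel bright bar that moves
# two pixels to the right between the two field capture times.

WIDTH, HEIGHT = 8, 6

def render_frame(bar_x):
    """Full progressive frame with a bright bar at column bar_x."""
    return [[255 if x == bar_x else 0 for x in range(WIDTH)]
            for y in range(HEIGHT)]

def capture_field(frame, parity):
    """Keep only even (parity=0) or odd (parity=1) lines, as interlacing does."""
    return {y: row for y, row in enumerate(frame) if y % 2 == parity}

def weave(even_field, odd_field):
    """Naive deinterlace: interleave the two fields into one frame."""
    return [even_field[y] if y % 2 == 0 else odd_field[y]
            for y in range(HEIGHT)]

# The fields are captured 1/60 s apart; the bar moves in between.
even = capture_field(render_frame(bar_x=2), parity=0)  # time t0
odd  = capture_field(render_frame(bar_x=4), parity=1)  # time t1

woven = weave(even, odd)

# A static source would put the bar in the same column on every line.
# Here it "combs": even lines show it at x=2, odd lines at x=4.
for y, row in enumerate(woven):
    print(y, row)
```

Real deinterlacers use motion-adaptive or motion-compensated tricks to hide this, but the missing per-instant lines are genuinely gone, which is the point above.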
I don't want to propagate the off-topic 720p vs 1080i discussion in this thread, so I'll just say that the answer depends on many factors such as the source material, bit rate, etc. Neither format is inherently superior to the other in all cases; that's why they both exist. 720p should not be discounted as "HD Lite", since there are cases where it is preferable to 1080i.
Of course, 1080p at a high bit rate is the undisputed king, but the only way to get that today is HD DVD or Blu-ray, so that does not apply to this Tivo discussion.
Back on topic, I'm also interested in Hyrax's question about how well the Tivo handles scaling and/or deinterlacing. I currently have my Tivo set to output 1080i fixed to avoid resolution-switching delays in my TV. In this config, the Tivo handles scaling and the TV handles deinterlacing. The main downside to this approach is that for 720p material, the Tivo has to convert it to 1080i, and the TV then converts it to 1080p. Technically, I don't think the 720p-to-1080i conversion can be lossless: either resolution or motion must be lost.

In practice, though, I have a hard time seeing a difference between that and native 720p, and the switching delays make it difficult to compare. It would be very interesting to see two identical Tivos and TVs side by side for comparison. Ideally, future Tivo boxes will offer a 1080p fixed output mode, which would allow maximum quality for all source resolutions.
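The "resolution or motion must be lost" claim is just arithmetic. Here's a back-of-envelope sketch (my numbers for the standard formats, not anything Tivo-specific) of the two ways to fit 720p/60 into 1080i/30:

```python
# Why 720p -> 1080i conversion must trade away resolution or motion.

# 720p/60: 1280x720 progressive frames, 60 per second.
p720_lines_per_sample = 720      # a full 720-line image every 1/60 s
p720_samples_per_sec  = 60

# 1080i/30: 1920x1080, delivered as 60 alternating 540-line fields/s.
i1080_lines_per_sample = 540     # only one field (half the lines) per 1/60 s
i1080_fields_per_sec   = 60

# Option A: preserve motion. Map each 60 Hz 720p frame to one 540-line
# field, so every temporal sample loses vertical detail.
lines_lost = p720_lines_per_sample - i1080_lines_per_sample
print(f"Option A: {lines_lost} lines of vertical detail lost per sample")

# Option B: preserve resolution. Scale one 720p frame to a full
# 1080-line image and send it as two fields, so only half as many
# distinct images per second survive.
motion_samples_kept = i1080_fields_per_sec // 2
print(f"Option B: {p720_samples_per_sec} -> {motion_samples_kept} unique images/s")
```

Either way something is discarded, which matches my experience that the difference is subtle rather than dramatic: 540 lines per instant is still a lot of detail, and 30 unique images per second is still reasonably fluid for most material.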