Originally Posted by cpcat
This is a common misconception with 1080p. 1080p input support is really a separate issue from 1080p screen resolution. 1080p24 input support in this case allows input of the native 1080p24 signal from HD-DVD/BD. It is then a relatively simple conversion for the display to 1080p60 (or 1080p72) followed by scaling to 768p. This advantage is independent of screen resolution. For this display you'll have the option of displaying at 768p60 or 768p72 (should eliminate judder).
1080p screen resolution is advantageous in the context of screen size/viewing distance. If, based on this consideration, you are likely to benefit, you will still benefit regardless of input signal resolution.
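Just to make cpcat's point concrete, here's a quick back-of-the-envelope sketch (Python, function name is my own invention, not anybody's firmware) of the cadence arithmetic when a display repeats 24fps film frames to reach a 60Hz or 72Hz refresh:

Code:
def repeat_counts(source_fps, target_fps, n_frames=6):
    """How many refreshes each source frame gets at the target refresh rate (toy model)."""
    counts, shown = [], 0
    for i in range(1, n_frames + 1):
        total = i * target_fps // source_fps   # refreshes owed after i source frames
        counts.append(total - shown)
        shown = total
    return counts

print(repeat_counts(24, 60))  # [2, 3, 2, 3, 2, 3] -> uneven 3:2 cadence, i.e. judder
print(repeat_counts(24, 72))  # [3, 3, 3, 3, 3, 3] -> every frame held equally long, no judder

That's why the 72Hz option is the one that "should eliminate judder" while 60Hz still carries the 3:2 cadence.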
Originally Posted by Influence
What really makes the difference here is which device has the better scaler. If your 720p plasma with 1080p/24/30/60 input capability has the better scaler (to downrez the incoming 1080p signal to its native resolution, so there's only one conversion of the source signal), then it makes sense to look for 1080p input capability. But if your HD-DVD/Blu-ray player has better scaling abilities than your display, it might be better to downscale (to 1080i or 720p) at the player and then pass that signal to your display (which will still need to do a little more scaling on its own to match the native resolution of the panel, so there could be two rounds of scaling going on, which may impact quality as well). There's no clear-cut answer there; you'll need to test your player/display combo to find the best settings. Also, it looks like the new HD-DVD players need to output a signal closest to their encoded format to look their best (for example: encoded at 1080p, output at 1080i, or 1080p when it becomes available; encoded at 720p, output at 720p).
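Influence's one-conversion vs. two-conversion point is easy to demonstrate with a toy model. Below is a rough Python sketch using a 1-D "scanline" and plain linear interpolation (real scalers use far better filters, and every name and number here is just illustrative): scaling 1080 samples straight to 768 should generally lose a bit less than going 1080 -> 720 -> 768.

Code:
import math

def resample(signal, new_len):
    """Linear-interpolation resample of a 1-D list to new_len samples (toy scaler)."""
    out = []
    for j in range(new_len):
        x = j * (len(signal) - 1) / (new_len - 1)
        i, frac = int(x), x - int(x)
        nxt = signal[min(i + 1, len(signal) - 1)]
        out.append(signal[i] * (1 - frac) + nxt * frac)
    return out

detail = lambda x: math.sin(x / 37) + 0.3 * math.sin(x / 5)   # synthetic picture detail
source = [detail(i) for i in range(1080)]                     # 1080-line source
ideal = [detail(j * 1079 / 767) for j in range(768)]          # perfect 768-line sampling of the same scene
one_pass = resample(source, 768)                              # display scales 1080 -> 768 directly
two_pass = resample(resample(source, 720), 768)               # player to 720p, then display to 768p

err = lambda img: sum(abs(a - b) for a, b in zip(img, ideal)) / len(img)
print(f"one conversion:  avg error {err(one_pass):.5f}")
print(f"two conversions: avg error {err(two_pass):.5f}")      # typically a bit worse

Whether that extra loss is actually visible on a given panel is exactly the "test your player/display combo" question Influence raises.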
Originally Posted by cpcat
I don't think it's that complicated. Scaling should not be given the relative importance you are giving it. Deinterlacing and telecine/inverse telecine will typically be where a display shows weakness, if anywhere. If a progressive display can't do a good job of scaling a progressive signal to its native resolution, I'd say get a better display, and that's independent of screen resolution.
Additionally, any small differences seen on a quality display due to scaling errors alone should be captured by the screen size/viewing distance consideration.
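For what it's worth, here's a toy illustration (Python again, labels invented by me) of why I think cpcat is right that deinterlacing/inverse telecine is the harder job: 3:2 pulldown spreads 4 film frames across 10 video fields, and the processor has to detect the cadence and re-pair fields correctly before it ever gets to scale anything. Lose the cadence lock and you get combing, not just a slightly soft picture.

Code:
def telecine_32(frames):
    """Spread progressive film frames into a 3:2 field sequence (simplified model)."""
    fields, pattern = [], [3, 2]              # fields taken from each film frame, alternating
    for i, frame in enumerate(frames):
        for _ in range(pattern[i % 2]):
            parity = "top" if len(fields) % 2 == 0 else "bottom"
            fields.append((frame, parity))
    return fields

def inverse_telecine(fields):
    """Weave fields from the same film frame back together (assumes a clean cadence)."""
    recovered, i = [], 0
    while i + 1 < len(fields):
        if fields[i][0] == fields[i + 1][0]:  # adjacent fields match -> one original frame
            recovered.append(fields[i][0])
            i += 2
        else:                                 # leftover repeat field from the cadence
            i += 1
    return recovered

film = ["A", "B", "C", "D"]                   # 4 film frames = 1/6 second at 24fps
video_fields = telecine_32(film)              # 10 fields = 1/6 second at 60 fields/sec
print(video_fields)
print(inverse_telecine(video_fields))         # ['A', 'B', 'C', 'D'] when the cadence is tracked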
Thanks for the interesting dialog. I have frequently pondered these issues. While I'm certain I don't have any better answers, I thought I might share my thoughts anyway. I definitely side with cpcat in the scaling versus deinterlacing debate; as far as I can tell, deinterlacing is by far the more complicated process and, concomitantly, the source of far more visible errors. That's not to suggest that scaling is irrelevant; it definitely matters, but that function should be squarely within the core competency of the panel manufacturer - after all, if they don't let us feed the beast NR (native resolution), it's a sure thing the signal is gonna have to get scaled!
If you agree, we're now left with a "my deinterlacer can beat up your deinterlacer" kind of debate. Leaving the use of a third-party box out of the equation, the deinterlacing hardware/algorithms employed by the source or the display should be the key differentiator. Or is it?
In my mind, another major factor (and maybe the only major factor) is the number of conversions/handshakes in the process. This is really the "holy grail" and the reason all the techies continually ask whether the panel can truly accept NR (i.e. without any internal processing). In this way, the panel only needs to spit out what it receives and we can employ an external box designed exclusively for and dedicated solely to the processing of source information. Of course, we need to feed those boxes with unprocessed signals (which is a whole separate issue).
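Put another way, you can almost reduce the comparison to counting how many times the picture gets reworked before it hits the glass. A crude sketch (the stage names and chains are entirely made up by me, just to show the bookkeeping):

Code:
LOSSY = {"deinterlace", "interlace", "scale", "rate-convert"}

def lossy_steps(chain):
    """List the steps in a chain that resample or otherwise rework the picture (toy model)."""
    return [step for step in chain if step.split(":")[0] in LOSSY]

# Chain 1: outboard processor does everything once; panel accepts NR and just displays.
external_box = ["deinterlace:480i->480p", "scale:480p->768p", "display:768p"]

# Chain 2: player upconverts, then the panel deinterlaces and scales again internally.
doubled_up = ["deinterlace:480i->480p", "scale:480p->1080p", "interlace:1080p->1080i",
              "deinterlace:1080i->1080p", "scale:1080p->768p", "display:768p"]

for name, chain in [("NR panel + external box", external_box),
                    ("player + panel both processing", doubled_up)]:
    print(f"{name}: {len(lossy_steps(chain))} lossy steps -> {lossy_steps(chain)}")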
So where does this leave me? I have concluded that I "need" a display that either (i) accepts NR digitally (preferably via HDMI given the enhanced bandwidth/bit depth, though DVI may be the only realistic choice), paired with a high quality nextgen processor using SO or HQV technology (for those who care, I'm currently leaning toward the new Crystalio II), or (ii) is optimized to process/scale 480i via HDMI. Given the good track record of Fujitsu's AVM II processing and confirmation that the new PIOs will also accept 480i via HDMI, the answer to my conundrum may be forthcoming. Recently, however, I have seen reports that the AVM II doesn't handle 480i via HDMI as well as analog component. If the new PIOs don't do digital 480i well, I'm probably going to have to buy the commercial Panny. I'm looking for a 60"+ display, and while I would definitely love 1080p performance, it seems to me that accepting NR or high-powered processing of digital 480i is the real issue.
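If it helps anyone else weighing the same choice, my requirement really boils down to a two-branch test (the attribute names below are obviously just my own shorthand, not anybody's spec sheet):

Code:
def panel_fits_my_needs(accepts_nr_digitally, good_480i_over_hdmi, have_outboard_processor):
    """My personal buying rule, written as a predicate (purely illustrative)."""
    # Option (i): panel passes through its native resolution; an outboard scaler does all the work
    option_one = accepts_nr_digitally and have_outboard_processor
    # Option (ii): the panel's own processing handles digital 480i well (AVM II-style)
    option_two = good_480i_over_hdmi
    return option_one or option_two

print(panel_fits_my_needs(True, False, True))    # e.g. an NR-capable panel plus a Crystalio II
print(panel_fits_my_needs(False, True, False))   # e.g. a panel that genuinely likes 480i via HDMI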
Your thoughts, comments and experiences would be more than welcome!