Originally Posted by TheVehicle
I just re-read your comment about SD programming and something else occurred to me. If I decide that I prefer to not have my display deinterlace 1080i content but I find that I like the way the display upconverts 480i and 480p (i.e. SD programming) to 720p (actually 768p to be more precise), should I choose "native" and check only 720p and 480i and 480p?
normally if you allow 720p *and* 480i as supported resolutions then yes, there is a subtle difference: selecting 720p will cause the box to output 720p all the time, even when the input is 480i. selecting native will cause the box to output 480i when the input is 480i (and output 720p for any HD content).
note your TV may not actually accept 480i over HDMI - in that case you would definitely uncheck 480i, which leaves 480p as an option. in this scenario the STB deinterlaces and the TV scales - actually, the signal is scaled *only* once, by the TV, since with a 1366x768 panel every input format gets scaled *at least* once on the way to the glass anyway (unless you have a computer or external video processor in the chain). scaling no more than once is generally good.
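the "scaled at least once" point is just arithmetic - a 1366x768 panel matches none of the broadcast formats, so something has to resample. a quick sketch (format list is the usual set, nothing STB-specific):

```python
# none of the common broadcast formats match a 1366x768 panel natively,
# so at least one scaling step is unavoidable on the way to the glass
panel = (1366, 768)
formats = {"480i/480p": (720, 480), "720p": (1280, 720), "1080i": (1920, 1080)}

for name, (w, h) in formats.items():
    # vertical scale factor the display (or something upstream) must apply
    print(f"{name}: {w}x{h} -> {panel[0]}x{panel[1]}, vertical scale {panel[1] / h:.3f}")
```

every factor comes out != 1.0, which is why a second scaling pass (STB upconverting 480i to 720p, then TV scaling 720p to 768) is the thing worth avoiding.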
i just tested: when viewing native 480i content with "native" mode selected but 480i support deselected, the STB falls back to the closest match, ie 480p (not one of the remaining supported HD resolutions like 720p).
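putting the whole selection behavior together, here's a hypothetical model of what the box seems to do - the function name and the tie-break logic are my guesses from the observed behavior, not anything from the firmware:

```python
# hypothetical model of the STB's output-resolution selection.
# resolutions are (lines, interlaced) tuples, e.g. (480, True) for 480i.

def pick_output(input_res, allowed, mode):
    """mode is either "native" or a fixed resolution tuple the user selected."""
    if mode != "native":
        return mode  # fixed mode: always output the selected resolution
    if input_res in allowed:
        return input_res  # native mode: pass the source format straight through
    # otherwise fall back to the closest allowed match - nearest line count,
    # preferring the same scan type (so 480i with 480i unchecked -> 480p)
    return min(allowed, key=lambda r: (abs(r[0] - input_res[0]), r[1] != input_res[1]))

allowed = [(480, False), (720, False)]  # 480p and 720p checked, 480i unchecked
print(pick_output((480, True), allowed, "native"))   # -> (480, False), ie 480p
print(pick_output((1080, True), allowed, "native"))  # -> (720, False), ie 720p
```

with only 720p in `allowed`, the native and fixed-720p paths return the same thing for every input, which matches the "no practical difference" answer below.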
Originally Posted by TheVehicle
Asking it another way, is there any practical difference between selecting 720p vs. selecting native with only 720p checked (not checking 1080i or 1080p)?
as written, this is a slightly different question: if you *only* select 720p as an allowed output resolution, then no - there would be no difference between native and 720p, since you're forcing 720p out for all input resolutions either way. but maybe that's not really what you were asking.
not sure how you determined the Samsung's deinterlacing is no good. if it really is worse than the STB's, then yes, just go native with 720p and 480p support selected.