Quote:
Originally Posted by htwaits
Your set converts all inputs to 1080p. If the input is 1080i from a good source then it will look so close to identical to a 1080p input that you won't be able to tell the difference. The difference between 1080i and 1080p input is more theoretical than it is practical.
Your set will work just fine with high-definition disc inputs over HDMI, even if they are 1080i, which your TV converts to 1080p before the images are displayed.
The point is that Blu-ray players will output 1080i to your HLR's HDMI ports, and your set will convert it to 1080p. If you use an HDMI connection, the signal will be digital all the way from the disc to your screen.
I think an important general point, not specific to this model of Samsung, is that the difference between 1080p and 1080i still matters, because the deinterlacers in most sets aren't that great yet compared to the quality of 480i deinterlacing we now have for DVDs. You will NEVER get OTA or cable in anything but 1080i or 720p, so you must have a good deinterlacer in the set for 1080i. For Blu-ray and HD DVD, if the player can output native 1080p, that would be preferable to trusting the set's deinterlacer, unless it is a very good one (I believe the JVC 1080p sets and the Sony XBRs have good ones). Otherwise you get annoying scanline twittering and moiré patterns.
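To make the weave-versus-bob point concrete, here's a little Python sketch of my own (a toy model, not any set's actual firmware): film-sourced 1080i carries the full 1080p frames inside its fields via 3:2 pulldown, so a cadence-aware weave can recover them, while a naive bob only ever shows 540 lines of real detail per picture. The frame labels and field pattern here are assumptions for illustration; a real deinterlacer has to infer which fields belong together from the pixels themselves.

Code:
# Toy model: 4 film frames spread across 10 interlaced fields by 3:2 pulldown.
film_frames = ["A", "B", "C", "D"]

def pulldown_32(frames):
    """Emit the 3:2 field sequence: 2 fields, then 3, then 2, then 3..."""
    fields = []
    for i, frame in enumerate(frames):
        repeats = 2 if i % 2 == 0 else 3
        for _ in range(repeats):
            parity = "top" if len(fields) % 2 == 0 else "bottom"
            fields.append((frame, parity))
    return fields

def weave_with_cadence(fields):
    """Cadence-aware weave: pair adjacent fields from the same film frame,
    recovering full-resolution progressive frames. Assumes the cadence is
    already known -- the hard part a real deinterlacer has to detect."""
    frames, i = [], 0
    while i + 1 < len(fields):
        if fields[i][0] == fields[i + 1][0]:   # same source frame: weave them
            frames.append(fields[i][0] + " @ full 1080 lines")
            i += 2
        else:                                  # orphaned repeat field: drop it
            i += 1
    return frames

def naive_bob(fields):
    """Naive bob: treat each 540-line field as its own picture and scale it up.
    Half the vertical detail is gone, which is where twitter/moire comes from."""
    return [f"{frame} @ 540 lines upscaled" for frame, _ in fields]

if __name__ == "__main__":
    fields = pulldown_32(film_frames)
    print("field sequence :", fields)
    print("cadence weave  :", weave_with_cadence(fields))
    print("naive bob      :", naive_bob(fields))

The bob output is where the twittering and moiré come from: every field gets scaled up on its own, so fine vertical detail flickers instead of sitting still.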
For an example of just how bad most deinterlacers are at converting 1080i to full progressive, see
http://www.hometheatermag.com/hookme...ook/index.html
Now that's just one magazine's testing, but the point is they used a test HD DVD disc in an off-the-shelf player to check things like 3:2 film cadence detection, and about 80% of the sets couldn't even pass that, including some 1080p sets (the flaw is pretty common in older 720p sets too). So the original question is a valid one. The difference between 1080i and 1080p is not merely "theoretical": there is good reason to get the signal in 1080p form if possible (it wasn't possible in the original questioner's case, it seems), because some sets will have problems reconstructing the original frame. Some viewers will be sensitive to this, while others won't notice or care. I know it bothers me when you can watch a scene change and see glitching for about half a second in a high-contrast area (like a building outline against the sky) before the set locks back onto the 3:2 cadence.
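On that "locks onto the 3:2 cadence" point, here's a rough sketch of why the lock takes a noticeable fraction of a second. The idea (simplified, hypothetical numbers, not any particular chipset's algorithm) is that the detector watches how different each field is from the field two earlier; with 3:2 pulldown, one slot in every five fields is a near-duplicate, and the set only commits to film-mode weaving once that slot has lined up for a few periods in a row. A scene cut breaks the pattern, so the detector has to re-acquire it, and during that window you see the artifacts.

Code:
import random

def field_differences(num_fields, cut_at=None):
    """Simulated |field[n] - field[n-2]| scores for a 3:2 pulldown stream.
    Every 5th field is a repeat (score near 0); a scene cut at `cut_at`
    makes a few fields look completely different regardless of cadence."""
    diffs = []
    for n in range(num_fields):
        if cut_at is not None and cut_at <= n < cut_at + 3:
            diffs.append(random.uniform(50, 100))   # cut: everything changes
        elif n % 5 == 4:                            # repeated-field slot
            diffs.append(random.uniform(0, 2))
        else:                                       # normal motion
            diffs.append(random.uniform(20, 60))
    return diffs

def cadence_lock(diffs, periods_needed=3, low_threshold=5.0):
    """Return the field index at which the detector 'locks on': it has seen
    a near-zero difference land in the same slot (mod 5) for
    `periods_needed` periods in a row."""
    slot, streak = None, 0
    for n, d in enumerate(diffs):
        if d < low_threshold:
            if slot == n % 5:
                streak += 1
            else:
                slot, streak = n % 5, 1
            if streak >= periods_needed:
                return n
    return None

if __name__ == "__main__":
    random.seed(1)
    diffs = field_differences(120, cut_at=60)
    # At 60 fields/sec, needing ~3 five-field periods means roughly a
    # quarter second of bad deinterlacing after losing lock; a more
    # cautious detector (more periods, stricter thresholds) takes longer.
    print("initial lock at field index:", cadence_lock(diffs))
    print("re-lock after the cut (fields past the cut):",
          cadence_lock(diffs[60:]))

Real detectors are much more elaborate than this (they also have to cope with broken cadences from bad edits and mixed film/video material, among other things), which is presumably part of why so many sets tripped up in that test.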
I expect eventually you'll see more extensive torture testing of sets' deinterlacers in reviews, because this problem will never go away for actually watching 1080i TV, as opposed to prerecorded movies that a player can deliver straight off the disc in progressive form.