
Registered · 592 Posts · Discussion Starter · #1
I have the HLR-6178, which will display 1080p through the VGA input.


When I bought this, I wasn't even thinking about which input it would accept 1080p on, so now that 1080p is actually here, am I screwed?

Are there any Blu-ray disc players that have VGA output?


Is an HTPC the only way to get 1080p on this TV?
 

Registered · 592 Posts · Discussion Starter · #2
I have been away from this forum for a few months, and I may have misunderstood, so please enlighten me. I thought that HD-DVD was going to be in 1080i, and now I am reading that it will support 1080p.


I also see the Xbox 360 will do 1080p and work with the Samsung TV I mentioned above.


After reading some stuff on the PS3, it doesn't appear to be a viable 1080p option for owners of the HLRxx78 models?
 

Registered · 1,369 Posts
It will accept 1080p through HDMI as well, not only VGA. Why are you so concerned about VGA? There are no players with a VGA output.


HD-DVD is 1080p, but the players output 1080i, except for the new Toshiba player. The PS3 outputs 1080p over HDMI. It's in the PS3 spec on any listing page.


When you purchase the TV, it should say that it accepts 1080p over HDMI. Look in the manual.
 

Coyote Waits · 27,308 Posts

Quote:
Originally Posted by Management


When you purchase the TV, it should say that it accepts 1080p over HDMI. Look in the manual.
hoopsrgreat has a 2005 HLR TV, which accepts 1080i, not 1080p, through its HDMI ports.

hoopsrgreat: Don't worry. Your set is fine to use with either Blu-ray or HD-DVD players. Both types of high definition discs store movies at 1080p/24. The HD-DVD players on the market now convert that 1080p/24 to 1080i before the signal is output. The Blu-ray players have an option to do the same thing.


Check with gamers, but my guess is that the PS3 and the Xbox will work fine with either type of disc.
 

Registered · 592 Posts · Discussion Starter · #6
Not sure if I am getting this.


I want to know the best way of getting 1080p on this set. I know I can get 1080i through either Blu-ray or HD-DVD and go into the HDMI input, but what about 1080p?


Seems simple enough on the Xbox 360, as it now outputs 1080p through VGA, if I am understanding what I was reading.


Don't think it is possible at the moment on the Blu-ray players.
 

Coyote Waits · 27,308 Posts

Quote:
Originally Posted by hoopsrgreat


Not sure if I am getting this.

Your set converts all inputs to 1080p. If the input is 1080i from a good source, it will look so close to a 1080p input that you won't be able to tell the difference. The difference between 1080i and 1080p input is more theoretical than practical.


Your set will work just fine with high definition DVD inputs over HDMI, even if they are 1080i, which your TV converts to 1080p before the images are displayed.

Quote:
Don't think it is possible at the moment on the Blu-ray players

The point is that the Blu-ray players will output 1080i to your HLR's HDMI ports, and your set will convert it to 1080p. If you use an HDMI connection, the signal will be digital all the way from the disc to your screen.
 

Registered · 224 Posts

Quote:
Originally Posted by htwaits


Your set converts all inputs to 1080p. If the input is 1080i from a good source, it will look so close to a 1080p input that you won't be able to tell the difference. The difference between 1080i and 1080p input is more theoretical than practical.


Your set will work just fine with high definition DVD inputs over HDMI, even if they are 1080i, which your TV converts to 1080p before the images are displayed.


The point is that the Blu-ray players will output 1080i to your HLR's HDMI ports, and your set will convert it to 1080p. If you use an HDMI connection, the signal will be digital all the way from the disc to your screen.

I think an important general point, not specific to this model of Samsung, is that the difference between 1080p and 1080i is still important because the deinterlacers in most sets aren't that great yet, compared to the quality 480i deinterlacing we now have for DVDs. You will NEVER get OTA or cable in anything but 1080i or 720p, so you must have a good deinterlacer in the set for 1080i. For Blu-ray and HD-DVD, if the player can output native 1080p, this would be preferable to trusting the set's deinterlacer, unless it is a very good one (I believe the JVC 1080p sets and the Sony XBRs have good ones). Otherwise you get annoying scanline twittering and moire patterns.


For an example of just how bad most deinterlacers are in 1080i to full progressive conversion, see
http://www.hometheatermag.com/hookme...ook/index.html


Now that's just one magazine testing, but the point is they used a test HD-DVD disc in an off-the-shelf player to check things like 3:2 film cadence detection, and about 80% of the sets couldn't even pass that, including some 1080p sets (this flaw is pretty common in older 720p sets too). So the original question is a valid one. The difference between 1080i and 1080p is not merely "theoretical"--there is good reason to get the signal in 1080p form if possible (it wasn't possible in the original questioner's case, it seems), because some sets will have problems reconstructing the original frame, and some viewers will be sensitive to this while others won't notice or care. I know it bothers me when you can watch a scene change and there's glitching for about half a second in a high-contrast area (like a building outline against the sky) before the set locks onto the 3:2 cadence.


I expect eventually you'll see more extensive torture testing of sets' deinterlacers in reviews, because this problem will never go away for actually watching 1080i TV, as opposed to prerecorded movies that come straight up in progressive form.
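
To make the 3:2 cadence point a bit more concrete, here is a rough sketch in plain Python (my own simplification for illustration -- the function names are made up, and this is not the test disc's method or any real chip's logic). Telecined 24 fps film repeats one field out of every five, so a deinterlacer that spots those repeats knows exactly which field pairs came from the same film frame and can weave them back together instead of guessing.

Code:

# Toy sketch of 3:2 pulldown and cadence detection (illustrative only).
# Four film frames A, B, C, D (24 fps) become ten interlaced fields (60i)
# in the classic 2:3 pattern, with field parity alternating top/bottom.

def telecine_32(film_frames):
    """Turn a list of film frames into a 60i field sequence via 3:2 pulldown."""
    fields = []
    position = 0
    for index, frame in enumerate(film_frames):
        copies = 2 if index % 2 == 0 else 3     # alternate 2 fields, 3 fields
        for _ in range(copies):
            parity = "top" if position % 2 == 0 else "bottom"
            fields.append((frame, parity))
            position += 1
    return fields

def detect_repeats(fields):
    """A cadence detector looks for fields that exactly repeat the field
    two positions earlier (same source frame, same parity)."""
    return [i for i in range(2, len(fields)) if fields[i] == fields[i - 2]]

fields = telecine_32(["A", "B", "C", "D"])
print(fields)                  # the 2:3 field pattern
print(detect_repeats(fields))  # [4, 9] -- repeats land on a regular 5-field cycle

# Once a set locks onto that cycle it knows which field pairs came from the
# same film frame and can weave them back together cleanly; a set that misses
# the cadence falls back to blending or line-doubling, which is where the
# moire and scanline twitter come from.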
 

Coyote Waits · 27,308 Posts

Quote:
Originally Posted by ranger999


I think an important general point, not specific to this model of Samsung, is that the difference between 1080p and 1080i is still important because the deinterlacers in most sets aren't that great yet ...

I agree, but hoopsrgreat was concerned that he had been "screwed" because he bought his TV before 1080p inputs other than VGA were available.


As you point out, there is a very good chance that he wouldn't detect a significant difference connecting either Blu-ray or HD-DVD players to his set as opposed to a 2006 model with 1080p inputs.


I'm not sure how Home Theater Magazine's analysis transfers to the experience of the vast majority of 1080p TV owners. Many times what is seen in a test pattern just doesn't become apparent when viewing normal material. Then there is the unfortunate fact that much "normal" material is full of its own flaws.
 

Registered · 271 Posts

Quote:
Originally Posted by hoopsrgreat


I have the HLR-6178, which will display 1080p through the VGA input.


When I bought this, I wasn't even thinking about which input it would accept 1080p on, so now that 1080p is actually here, am I screwed?

Are there any Blu-ray disc players that have VGA output?


Is an HTPC the only way to get 1080p on this TV?

hoopsrgreat,


I think I read somewhere on this forum that, if you have a native progressive display like 720p or 1080p, you should feed it with progressive inputs like 720p or 1080p. So since your display does not accept 1080p, you should feed it the next best input, which in this case is 720p. The logic is that the display will just upscale the 720p input to its native 1080p. The PQ, however, will depend on how good the DVD player is. I would like for someone to correct me on this.
 

Registered · 16,166 Posts

Quote:
Originally Posted by fubdap


I think I read somewhere on this forum that, if you have a native progressive display like 720p or 1080p, you should feed it with progressive inputs like 720p or 1080p. So since your display does not accept 1080p, you should feed it the next best input, which in this case is 720p. The logic is that the display will just upscale the 720p input to its native 1080p.

If your display has a halfway decent deinterlacer, then feeding it a 1080i signal (which the display just has to deinterlace to 1080p) should give you the same PQ as feeding it 1080p. Of course, it all depends on the specific equipment, so you should compare 720p versus 1080i input to see which you prefer.
 

Coyote Waits · 27,308 Posts

Quote:
Originally Posted by fubdap


I would like for someone to correct me on this.

BillP's correction works for me too.
 

Registered · 271 Posts

Quote:
Originally Posted by BillP


If your display has a halfway decent deinterlacer, then feeding it a 1080i signal (which the display just has to deinterlace to 1080p) should give you the same PQ as feeding it 1080p. Of course, it all depends on the specific equipment, so you should compare 720p versus 1080i input to see which you prefer.

Hi BillP


See this post about the best output resolution for a 1080p TV: http://www.avsforum.com/avs-vb/showt...&post7407251


Reading through the post, it made sense to me. What do you think?
 

Registered · 16,166 Posts
fubdap, that post is talking about scaling a 480i signal (SD DVD) to either 720p or 1080i, and which might be theoretically better. There are arguments both ways, and of course the best answer is to try it both ways to see which looks best with your particular equipment. But that is very different from BD or HD-DVD, which is what the original post asked about. Both BD and HD-DVD start with a 1080p signal (not 480i) and interlace it to 1080i. Then you have the option of outputting 1080i, 1080p, or 720p. For a 1080p display, whether you output 1080i or 1080p should make no difference, since either way the signal has to be deinterlaced back to 1080p (by either the player or the display) -- I'm not aware of any BD or HD-DVD player that can simply output the 1080p signal without interlacing and then deinterlacing it again. Outputting 720p to a 1080p display makes no sense at all with BD or HD-DVD.
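
As a small illustration of why the 1080i versus 1080p output choice shouldn't matter for film content, here is a toy sketch in plain Python (illustrative names and tiny stand-in frames, not any real player's or display's implementation). The player splits each progressive frame into two fields, and a deinterlacer that pairs those fields correctly weaves back exactly the frame it started with, so nothing is lost on the round trip.

Code:

# Toy sketch: interlacing a film frame and weaving it back is lossless.
# Six short strings stand in for the 1080 scanlines of a real frame.

def split_fields(frame):
    """Player side: turn one progressive frame into a top and a bottom field."""
    return frame[0::2], frame[1::2]        # even lines, odd lines

def weave(top_field, bottom_field):
    """Display (or player) side: interleave the two fields back into a frame."""
    rebuilt = []
    for even_line, odd_line in zip(top_field, bottom_field):
        rebuilt.extend([even_line, odd_line])
    return rebuilt

decoded_frame = [f"scanline{i}" for i in range(6)]   # stand-in for a decoded 1080p frame

top, bottom = split_fields(decoded_frame)   # what goes out over a 1080i connection
print(weave(top, bottom) == decoded_frame)  # True: the original frame comes back exactly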
 

Registered · 271 Posts

Quote:
Originally Posted by BillP


fubdap, that post is talking about scaling a 480i signal (SD DVD) to either 720p or 1080i, and which might be theoretically better. There are arguments both ways, and of course the best answer is to try it both ways to see which looks best with your particular equipment. But that is very different from BD or HD-DVD, which is what the original post asked about. Both BD and HD-DVD start with a 1080p signal (not 480i) and interlace it to 1080i. Then you have the option of outputting 1080i, 1080p, or 720p. For a 1080p display, whether you output 1080i or 1080p should make no difference, since either way the signal has to be deinterlaced back to 1080p (by either the player or the display) -- I'm not aware of any BD or HD-DVD player that can simply output the 1080p signal without interlacing and then deinterlacing it again. Outputting 720p to a 1080p display makes no sense at all with BD or HD-DVD.


BillP,


Thanks for the clarification.
 

Registered · 224 Posts

Quote:
Originally Posted by htwaits


I agree, but hoopsrgreat was concerned that he had been "screwed" because he bought his TV before 1080p inputs other than VGA were available.


As you point out, there is a very good chance that he wouldn't detect a significant difference connecting either Blu-ray or HD-DVD players to his set as opposed to a 2006 model with 1080p inputs.


I'm not sure how Home Theater Magazine's analysis transfers to the experience of the vast majority of 1080p TV owners. Many times what is seen in a test pattern just doesn't become apparent when viewing normal material. Then there is the unfortunate fact that much "normal" material is full of its own flaws.

I agree that many people won't see it. And if editing is done on a field basis, you'll get bobbed frames and other artifacts from failure to preserve the 3:2 cadence (ever watch DVDs with 24p cartoons edited on 60i video equipment, like the Futurama Region 1 DVDs?).


On the other hand, I've read that the stairs at the start of chapter 8 of the Mission: Impossible Blu-ray produce visible moire patterns and moving lines with 1080i output on many HDTVs, whereas 1080p straight from the player looks perfect on all the TVs CNET tested, which suggests that the effect is not purely theoretical.
 

Registered · 94 Posts
hoopsrgreat: I think you may be a little too hung up on 1080p through the VGA port. I have connected my computer (which has DVI output [digital] but must at some point be converted to VGA [analog]) to the newer 1080p models, and I can tell you that movies output at 720p look way better than 1080p over VGA. I believe the reason they made the VGA port accept 1080p is so computers won't look fuzzy. I once hooked up my computer to a 1080i CRT TV and it was AWFUL! I could barely read anything even when I had my resolution set at 1024 x 768.


If you can REALLLLY tell the difference between 1080p and 720p, then you need to buy a new TV that accepts 1080p through HDMI, so you can view the PS3 or Xbox 360 in 1080p over a digital connection. In my opinion, it will make a world of difference between 1080p over VGA and 1080p over HDMI.
 

Registered · 3,297 Posts
That hometheatermag.com link is deinterlacing test part 2. Go find part 1, which tests last year's Samsung and Mitsubishi DLPs; neither deinterlaces 1080i to 1080p properly. They are really doing 540p. This year they corrected (improved) that function. This magazine (hdguru) exposed them. This may not have a huge visual effect, but it is not desirable nevertheless. Better yet, look at this year's Sony LCoS XBR: it didn't do anything properly except in footnoted settings (I don't understand those). As ranger999 points out, they still don't do 3:2 pulldown properly this year, and he noticed it.


My friend has last year's 62" Mitsubishi DLP, and I suggested he try 720p output instead of 1080i from HD cable and DVD. He couldn't tell any difference. Rightfully so, he is very disappointed that his new TV won't accept 1080p in when they call it a 1080p TV. His picture is probably softer than this year's models.
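
For what "really doing 540p" amounts to, here is a crude plain-Python sketch (my own illustration, not anything from the hdguru tests). A set that throws away one field and just line-doubles the other -- "bob" deinterlacing -- loses half the vertical detail, while a set that weaves both fields keeps all 1080 lines.

Code:

# Toy comparison of "bob" (effectively 540p) versus "weave" deinterlacing.

def bob(field):
    """Discard one field and repeat each remaining line to fill the frame."""
    frame = []
    for line in field:
        frame.extend([line, line])           # line-double: 540 lines -> 1080 lines
    return frame

def weave(top_field, bottom_field):
    """Use both fields, interleaving them into the full progressive frame."""
    frame = []
    for even_line, odd_line in zip(top_field, bottom_field):
        frame.extend([even_line, odd_line])
    return frame

# A frame with fine detail that alternates from one scanline to the next.
original = ["dark", "bright", "dark", "bright", "dark", "bright"]
top, bottom = original[0::2], original[1::2]

print(weave(top, bottom))   # ['dark', 'bright', 'dark', ...]  detail preserved
print(bob(top))             # ['dark', 'dark', 'dark', ...]    detail gone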
 

Registered · 224 Posts

Quote:
Originally Posted by BillP


fubdap, that post is talking about scaling a 480i signal (SD DVD) to either 720p or 1080i, and which might be theoretically better. There are arguments both ways, and of course the best answer is to try it both ways to see which looks best with your particular equipment. But that is very different from BD or HD-DVD, which is what the original post asked about. Both BD and HD-DVD start with a 1080p signal (not 480i) and interlace it to 1080i. Then you have the option of outputting 1080i, 1080p, or 720p. For a 1080p display, whether you output 1080i or 1080p should make no difference, since either way the signal has to be deinterlaced back to 1080p (by either the player or the display) -- I'm not aware of any BD or HD-DVD player that can simply output the 1080p signal without interlacing and then deinterlacing it again. Outputting 720p to a 1080p display makes no sense at all with BD or HD-DVD.

Can you clarify this? I thought the new second-generation players could output straight 1080p60 or 1080p24 from the frame buffer. For those of you unfamiliar with the context, some of the first-generation players actually produced 1080i internally, and then a deinterlacing stage had to produce 1080p from it, which is just crazy (there was no all-in-one chip solution that could produce 1080p using the required codecs, if I remember correctly, which is why you have this conceptual monstrosity).


I'm pretty sure the PlayStation 3 can output 1080p60 natively, without internal deinterlacing. Correct me if I'm wrong. Of course, the disc has to be mastered in progressive mode too, but I assume newer discs are doing this.
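
On the p24 versus p60 point, here is a rough plain-Python sketch (a simplification, not how any particular player's hardware works) of how 1080p/24 material can become 1080p/60 straight from the frame buffer with no interlacing step at all: whole decoded frames are simply repeated in a 3-2-3-2 pattern.

Code:

# Toy sketch: 24 fps film to 60 fps progressive output by 3:2 frame repetition.
# Whole frames are repeated straight from the buffer; no fields, no deinterlacing.

def repeat_32(frames_24p):
    """Emit each decoded film frame 3 times, then 2 times, alternating,
    which turns 24 frames/sec into 60 frames/sec (24 * 5/2 = 60)."""
    output = []
    for index, frame in enumerate(frames_24p):
        copies = 3 if index % 2 == 0 else 2
        output.extend([frame] * copies)
    return output

film = ["A", "B", "C", "D"]    # four frames of 24p film
print(repeat_32(film))         # ['A', 'A', 'A', 'B', 'B', 'C', 'C', 'C', 'D', 'D']
# 4 film frames -> 10 output frames, the same 2.5x ratio as 24 -> 60.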
 