
· Registered · 84 Posts · Discussion Starter · #1
I just had my HD cable box installed by Comcast. WOW! The HD channels are amazing, and the color and depth are truly stunning on the GWII. The box that was installed has both component and DVI outputs; currently I have it connected via component cables. I asked the service tech about a DVI cable, but they don't provide them. So my question is: are there any benefits or advantages to using the DVI input on the GWII over the component cables?


Thanks

Chris
 

· Registered · 1,072 Posts
Just out of curiosity, where are you located? I have Comcast HD as well and we only have Component right now on the boxes.


As for your question, the DVI gets converted to analog on the GWII so that the signal can utilize the same path as the component inputs. I don't know for sure if DVI will look any different, and some have suggested in these forums that it looks identical to component. I've only seen component so I can't personally confirm that. You're in a somewhat unique position to do an A/B comparison. Care to try that?




Ron
 

· Registered · 84 Posts · Discussion Starter · #3
Ron,


I would love to do an actual A/B comparison, but I don't have a DVI cable. I asked the tech for one, but he said Comcast doesn't provide them. If it's cheap enough I may pick one up this weekend, though I have a feeling a decent cable runs $50 or more.


This box was just installed today. I'm in Massachusetts, and Comcast here started providing HD only about a week ago, so it's probably the latest and greatest they offer. I asked, and they have only one box on offer at this time (here, anyway). It seems like a decent one: a Motorola box with component, DVI, optical and coax audio outputs, two rear USB ports, an RJ45 port, and a front-access Smart Card slot with an additional front USB port. Again, it seems like a decent box.


Chris
 

· Registered · 807 Posts
For what it's worth: I compared DVI and component (using a decent set of component cables from BlueJeans) on my 60" GWII for the HTPC connection and could not see any difference.


I couldn't compare the HD output since my cable box only has component, no DVI. The computer does show flaws quite clearly, though, as artifacts (ghosting, lack of sharpness, etc.) are immediately obvious.


-Rob-
 

· Registered · 27 Posts
According to UMR's posts, it appears that component and DVI on the Sony route directly into the same circuitry, so the results should be equal. For my new Sony, being delivered this week to replace my 38" Anaconda piece of crap from Tweeter's, the DVI input will be the hookup for my HTPC.
 

· Registered · 43 Posts
As a DirecTV die-hard for 7+ years, I've always felt that satellite was far superior to digital cable offerings. With DirecTV, though, I only get four HD channels (HDNet, HBO, Showtime, and a pay-per-view channel), so I'm looking forward to seeing what Comcast brings to Seattle (since I only get two terrestrial HD channels where I live).


Ah, back on topic. I have a 50" GWII connected to a Sony SAT-HD200 via DVI. I had it connected via component for a while, but I needed the inputs, so I moved the sat box to DVI because I could. I haven't done any A/B comparison tests.


One piece of advice: don't spend ~$100 on a fancy DVI cable. I used the free one that came with my Dell flat-panel computer monitor and it works fine. My belief is that since it's a digital signal, it either works or it doesn't, and gold plating and quad shielding aren't going to improve the signal the way they would with an analog feed. Anybody want to contest this theory?


Tom
 

· Registered · 807 Posts
I'm in Tom's camp, and this same thing was discussed a bit in this thread. Of course, there will always be those who claim to see a difference between their $100+ DVI cable and that free one (or the $18 one from Pacific Cable). That's bound to fail in a blind test, though. Save your money for some nice Superbit DVDs!


As an aside: when you feed the GWII a PC signal (with sharp, well-defined characters and such), it becomes clear that the GW electronics add quite a bit of junk before the image is displayed. The rule of the weakest link applies, and for that reason I doubt super-fancy (and expensive) cables have any effect on what you're going to see. That includes component cables; a decent set is probably as good as it gets, and in fact I doubt I could even see the difference between some cheap stuff from WalMart and my BlueJeans cables. Moral of this story: spend your money where it matters.


-Rob-
 

· Premium Member · 10,595 Posts
Quote:
Originally posted by tlavelle
... My belief is that since it's a digital signal, it either works or it doesn't, and gold plating and quad shielding aren't going to improve the signal the way they would with an analog feed. Anybody want to contest this theory?


Tom
Actually, this is not true. DVI has no error correction or fault detection, so it is not an all-or-nothing type of connection. There have been several threads showing how the picture can degrade.
 

· Registered · 807 Posts
Hi Umr. First off, if there were medals for above-and-beyond work in the video field, you'd deserve one. Thank you very much for your never-ending quest to get the best out of the GWII and for your very well written instructions on tweaking it!!


I've tried to find the threads you refer to, but the search engine either returns too many hits or none at all for every search term I can come up with. Would you have a link? I'd be very interested.


The DVI specs indicate there's control data sent over the link that the receiver uses to lock onto the stream. The receiver is not supposed to display anything unless it determines it's locked onto the data stream and remains locked. Also, the way DVI encodes pixel data means that any error is going to result in some pretty funky output values (assuming the errors are rare enough that the receiver stays locked onto the stream). The DVI cable actually contains four links: one each for red, green, and blue, and one more for the pixel clock. I suppose it's possible to lose a link (other than the clock), and if the display is still showing a picture you'll be missing a color. All of this should be pretty obvious in the resulting image.

So I'm scratching my head trying to see how DVI is not going to be an all-or-nothing connection. As far as I can tell, it's either going to work as intended or you're going to get a very messed-up picture. Then again, I've not tried this, and all I know is from reading the specs. The one DVI cable I use seems to be doing a pretty much perfect job (the image looks great!), meaning I've never seen a DVI cable fail. So I'd be very interested in more information showing an "in between" on the all-or-nothing.
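Since I was digging through the specs anyway, here's a rough Python sketch of the TMDS coding DVI uses on each color link. It's my own simplification (the DC-balance running disparity is left out, so it's illustrative, not spec-complete), but it shows why a bit error produces "funky" values rather than a clean, detectable failure:

```python
# Simplified sketch of TMDS data coding (my own, not spec-complete).
# The real encoder also tracks a running disparity for DC balance; here
# the balance bit is fixed at 0. That's enough to show the failure mode:
# one flipped bit on the wire decodes to a visibly wrong pixel value,
# and nothing in the protocol detects or corrects it.

def tmds_encode(d: int) -> list[int]:
    """Encode an 8-bit pixel value into a 10-bit TMDS symbol (simplified)."""
    bits = [(d >> i) & 1 for i in range(8)]        # LSB first
    if sum(bits) > 4 or (sum(bits) == 4 and bits[0] == 0):
        q, flag = [bits[0]], 0                     # XNOR chain minimizes transitions
        for i in range(1, 8):
            q.append(1 - (q[i - 1] ^ bits[i]))
    else:
        q, flag = [bits[0]], 1                     # XOR chain
        for i in range(1, 8):
            q.append(q[i - 1] ^ bits[i])
    return q + [flag, 0]                           # q[8]=coding flag, q[9]=balance bit (fixed)

def tmds_decode(q: list[int]) -> int:
    """Recover the 8-bit value; trusts the symbol blindly (no error check)."""
    data = [1 - b for b in q[:8]] if q[9] else q[:8]
    bits = [data[0]]
    for i in range(1, 8):
        x = data[i] ^ data[i - 1]
        bits.append(x if q[8] else 1 - x)
    return sum(b << i for i, b in enumerate(bits))

pixel = 0x80                                       # mid-gray on one color channel
symbol = tmds_encode(pixel)
assert tmds_decode(symbol) == pixel                # clean link: perfect recovery

corrupted = symbol.copy()
corrupted[3] ^= 1                                  # one bit error in transit
print(f"sent 0x{pixel:02X}, got 0x{tmds_decode(corrupted):02X}")  # sent 0x80, got 0x98
```

Because the decoder works through an XOR/XNOR chain, a single bad bit corrupts two decoded bits, so the receiver happily displays a wrong value with no way of knowing.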


-Rob-
 

· Registered · 807 Posts
Thank you Umr!

I'll eat humble pie and admit I was wrong in thinking DVI would either work or not. It seems that, the way monitors implement it, they'll display bogus data if that's what the receiver decodes. It looks like the effect of errors starts out fairly subtle. It's really a pity to come up with a digital display standard (DVI) and not make it more robust!


Having said that, I'd still go with a cheap DVI cable for the average 10' run at intermediate resolutions. My GWII is running at a pixel clock of 77 MHz (1280x720 resolution), which is less than half of the DVI limit. Digital does have the inherent advantage over analog that it can tolerate a lot of signal degradation and still be decoded perfectly (it's much easier to distinguish a 1 from a 0 than to preserve an analog waveform).
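For anyone who wants to check that arithmetic, here's a quick Python sketch. It assumes the standard CEA-861 720p60 timing, which works out to 74.25 MHz; my set reports a slightly different 77 MHz, presumably due to its own blanking intervals:

```python
# Back-of-the-envelope pixel-clock check (assumption: standard CEA-861
# 720p60 timing; an HTPC or scaler may use slightly different blanking,
# which is presumably where the 77 MHz readout comes from).

H_ACTIVE, H_BLANK = 1280, 370    # 1650 total pixels per line
V_ACTIVE, V_BLANK = 720, 30      # 750 total lines per frame
REFRESH_HZ = 60
DVI_SINGLE_LINK_MHZ = 165        # single-link TMDS pixel-clock ceiling

clock_mhz = (H_ACTIVE + H_BLANK) * (V_ACTIVE + V_BLANK) * REFRESH_HZ / 1e6
print(f"{clock_mhz:.2f} MHz, {clock_mhz / DVI_SINGLE_LINK_MHZ:.0%} of the single-link limit")
# -> 74.25 MHz, 45% of the single-link limit
```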


-Rob-
 