
Post #1 · Discussion Starter · Registered · 137 Posts
I have built an HTPC, which uses an Nvidia 7600GS card. The TV is a Toshiba 72MX196 (72" 1080p DLP).


After extensive testing and back-and-forth, I have concluded that using the HDTV Out on this card (which connects to the Component In of the TV) is far superior to using the (very tempting) DVI-->HDMI link. Believe me, I did try to calibrate the TV for each input using all the reference images and "guides" I could get my hands on. I even have a DVD that walks you through calibrating your TV using its controls.


1. Desktop text and icons are the only two things that look better with DVI.

2. Colors through HDTV Out are richer and deeper than over DVI.

3. Video (moving images: movies, TV [through the ATI Elite], etc.) is richer and *more* detailed on HDTV Out. DVI has a washed-out or slightly out-of-focus look and doesn't look as detailed.

4. This is very annoying with DVI: when you hook up your TV to the HTPC using a DVI-->HDMI connection, the image adjustment controls (which let you move the image up/down/left/right so that it's centred) are gone! I saw this when using DVI with my desktop NEC monitors, but it was never an issue there. With TVs it is! I have read quite a few forums, and the missing controls (in DVI mode) to centre the image seem to be haunting a lot of people with various makes and models of HDTVs.


It could be that I am missing something, but based on my experience I would recommend that you stick with the HDTV Out port Nvidia has provided on the card. It is there for a reason. DVI output is NOT meant for TVs, and it shows. Maybe some of the artifacts I saw won't be a big deal or too obvious on smaller-screen TVs, but on a large TV they are VERY obvious!


By the way, in case someone is wondering, my HTPC is a Core 2 Duo E6300 running Windows MCE 2005. I am also running Nvidia PureVideo, and I used both the classic and modern interfaces of the Nvidia control panel. And I did play with *every* fricken setting it had to offer...


thx.
 

Post #2 · Registered · 389 Posts
You probably should have picked a different title, because it isn't a universal truth. I happen to have an Nvidia 7950GX2 connected to a Sony Pearl on a 136" screen, and it's one of the best images I have seen. Going to component skews the color accuracy for me. Also, you can calibrate your overall color space using even low-end tools like Spyder2 Pro or SpyderTV Pro, and you can tweak your viewing preferences in your player if movies are your problem.


Not saying your post isn't valid, just that it's one data point to consider. Every TV is going to be different in how well it handles each connector type. The take-away should be to try both and see what works for you. Individual mileage may vary.
 

Post #3 · Registered · 370 Posts
Also, I'll add:


Regarding centering, you can use your television's Service Menu (not the user menu) to adjust the horizontal and vertical centering. If your display is not centered properly over DVI, it is because the factory did a half-assed job of setting this up (they usually do). Check the "display devices" forums and look for the thread specific to your make/model to find out how to access the Service Menu.


Also, I have found that my last Toshiba DLP did not allow for full PC colorspace levels, forcing me to use the nvidia control panel color correction to compensate.

My new Samsung DLP allows for the full range of 0-255 PC colorspace without having to compress the output from the PC, resulting in a much better picture.

No more banding!
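
To put numbers on the compression step: when the set can't take 0-255, the PC's output gets squeezed into the 16-235 video range and the TV stretches it back out, and at 8 bits per channel that round trip throws levels away. A quick Python sketch of just that math (an illustration only, not what the drivers literally run):

Code:
import numpy as np

# Every 8-bit PC level, 0 through 255.
full = np.arange(256, dtype=np.float64)

# Squeeze into limited "video" range (black = 16, white = 235),
# quantized back to 8-bit integers as it would be on the link.
limited = np.round(16 + full * 219.0 / 255.0).astype(np.uint8)

# The TV stretches it back out to full range, again at 8 bits.
restored = np.clip(np.round((limited - 16.0) * 255.0 / 219.0), 0, 255).astype(np.uint8)

print("distinct levels in: ", len(np.unique(full.astype(np.uint8))))  # 256
print("distinct levels out:", len(np.unique(restored)))               # 220

About 36 of the 256 steps get merged on the way through, which is exactly what shows up as banding on smooth gradients. Feed the set 0-255 directly and that round trip never happens.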
 

Post #4 · Registered · 23,131 Posts

Quote:
Originally Posted by badshah2000


I have built an HTPC, which uses an Nvidia 7600GS card. The TV is a Toshiba 72MX196 (72" 1080p DLP).


After extensive testing and back-and-forth, I have concluded that using the HDTV Out on this card (which connects to the Component In of the TV) is far superior to using the (very tempting) DVI-->HDMI link. Believe me, I did try to calibrate the TV for each input using all the reference images and "guides" I could get my hands on. I even have a DVD that walks you through calibrating your TV using its controls.

That's a problem with your TV, not your nVidia card or the HDMI link. I run DVI-HDMI to my IN76 and it's great.

Quote:
1. Desktop text and icons are the only two things that look better with DVI.

2. Colors through HDTV Out are richer and deeper than over DVI.

3. Video (moving images: movies, TV [through the ATI Elite], etc.) is richer and *more* detailed on HDTV Out. DVI has a washed-out or slightly out-of-focus look and doesn't look as detailed.

Sounds like your TV isn't properly calibrated for video levels over HDMI. Not sure why.

Quote:
4. This is very annoying with DVI: when you hook up your TV to the HTPC using a DVI-->HDMI connection, the image adjustment controls (which let you move the image up/down/left/right so that it's centred) are gone! I saw this when using DVI with my desktop NEC monitors, but it was never an issue there. With TVs it is! I have read quite a few forums, and the missing controls (in DVI mode) to centre the image seem to be haunting a lot of people with various makes and models of HDTVs.

That's a TV problem. You should be able to work around it by using the overscan compensation in the nVidia drivers.

Quote:
It could be that I am missing something, but based on my experience I would recommend that you stick with the HDTV Out port Nvidia has provided on the card. It is there for a reason. DVI output is NOT meant for TVs, and it shows.

DVI works fine on many TVs. Your TV seems unable to deal with it correctly though.

Quote:
Maybe some of the artifacts I saw won't be a big deal or too obvious on smaller-screen TVs, but on a large TV they are VERY obvious!

Definitely possible.
 

Post #5 · Registered · 676 Posts
I have an XFX Nvidia 7950 GT connected through DVI to a more than three-year-old Samsung HLN56W DLP. All I can say is: banding all over the board. Both VMR9 and overlay. I forced VMR9 to 0-255, and then to 16-255; same thing. I recently plugged in a 360 HD DVD add-on and got horrendous banding in PowerDVD (which uses overlay), something I haven't seen since 1999, when the first dirt-cheap DVD players came to market. Switching to VGA, the banding was the same and everything else was worse.

I also thought it might be my old TV's fault, but it looks so much better receiving an HD 1080i signal through component from my satellite receiver: banding is almost non-existent (you have to look long and hard to notice any, as opposed to it being in your face all the time with the HTPC), the colours are deep, and the black levels are very good. I am starting to doubt the very idea of HTPCs being able to provide a hi-def picture comparable to dedicated CE devices. What is wrong here? Could it be that the earlier-generation HD sets just can't process non-component signals well, or were the home theatre software engineers at Nvidia (I didn't get much different results with ATI either) all laid off once decent SD DVD playback on the PC was achieved?
 

Post #6 · Registered · 370 Posts
Ditcho,


I think the banding problem is that your (old) DLP only does 8-bit color processing. The newer sets do 10-bit color processing.
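
A toy way to see why the extra bits matter: take a plain 8-bit ramp, knock the contrast down and bring it back up (a stand-in for the kind of adjustment a set's internal processing applies), and count how many distinct levels survive when the intermediate math is held at 8 bits versus 10 bits. Rough Python sketch, purely an illustration (the 80% contrast figure is just an example):

Code:
import numpy as np

levels = np.arange(256)  # every 8-bit input level

def cut_and_restore_contrast(levels, bits):
    """Drop contrast to 80%, keep the intermediate at the given bit depth,
    then scale back up and requantize to 8 bits for the panel."""
    peak = 2 ** bits - 1
    intermediate = np.round(levels * 0.8 * peak / 255.0)
    return np.round(intermediate * 255.0 / peak / 0.8).astype(np.uint8)

print("distinct levels, 8-bit pipeline: ",
      len(np.unique(cut_and_restore_contrast(levels, 8))))   # 205
print("distinct levels, 10-bit pipeline:",
      len(np.unique(cut_and_restore_contrast(levels, 10))))  # 256

With the 8-bit intermediate, about a fifth of the levels are gone for good, and every further adjustment in the chain compounds it; at 10 bits the ramp comes back intact.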

The old sets also only had about 1000:1 contrast ratio. The new ones have 10000:1 contrast ratio.

Add the fact that the new Samsungs will accept 1080p over HDMI, and it might be time for an upgrade.

Prices have come way down, and when the 2007 models come out in a few months, they'll practically be giving away the 2006 models.
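
And if you want to see how much of the banding is the set itself versus the signal path before spending any money, a plain grayscale ramp shown full screen over each input makes it obvious. A rough Python sketch that writes one out (assumes NumPy and PIL are installed; the resolution and filename are just examples):

Code:
import numpy as np
from PIL import Image

WIDTH, HEIGHT = 1920, 1080  # match the display's native resolution

# One row sweeping linearly from 0 to 255, repeated down the screen.
row = np.linspace(0, 255, WIDTH).round().astype(np.uint8)
ramp = np.tile(row, (HEIGHT, 1))

Image.fromarray(ramp, mode="L").save("gray_ramp.png")
print("Wrote gray_ramp.png - view it full screen over each input.")

If the ramp steps the same way on both inputs, the panel's processing is the limit; if it only bands over DVI/HDMI, the levels are getting crunched somewhere between the renderer and the set.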
 