Nvidia DVI to HDTV (HDMI) is *NOT* a good idea!!! - AVS Forum
post #1 of 6, 01-24-2007, 05:50 AM - badshah2000 (Member, Thread Starter)
I have built an HTPC around an Nvidia 7600GS card. The TV is a Toshiba 72MX196 (72" 1080p DLP).

After extensive testing and back-and-forth, I have concluded that using the HDTV Out on this card (which connects to the Component In of the TV) is far superior to using the (very tempting) DVI-->HDMI link. Believe me, I did try to calibrate the TV for each input using all the reference images and "guides" I could get my hands on. I even have a DVD that walks you through calibrating your TV using its controls.

1. Desktop text and icons are the only two things that look better with DVI.
2. Colors through HDTV Out are richer and deeper than over DVI.
3. Video (moving images: movies, TV [through the ATI Elite], etc.) is richer and *more* detailed over HDTV Out. DVI has a washed-out, slightly out-of-focus look and doesn't appear as detailed.
4. This is very annoying with DVI: when you hook the TV up to the HTPC over a DVI-->HDMI connection, the image adjustment controls (which let you move the image up/down/left/right so that it's centred) are gone! I saw the same thing when using DVI with my desktop NEC monitors, but there it was never an issue. With TVs it is! I have read quite a few forums, and the missing centring controls in DVI mode seem to be haunting a lot of people across various makes and models of HDTVs.

It could be that I am missing something, but based on my experience I would recommend that you stick with the HDTV Out port Nvidia has provided on the card. It is there for a reason. DVI output is NOT meant for TVs, and it shows. Maybe some of the artifacts I saw wouldn't be a big deal, or even obvious, on smaller-screen TVs, but on a large TV it is VERY obvious!

By the way, in case someone is wondering, my HTPC is a Core 2 Duo E6300 running Windows MCE 2005. I am also running Nvidia PureVideo. I used both the classic and modern interfaces of the Nvidia control centre, and I played with *every* fricken setting it had to offer...

thx.
post #2 of 6, 01-24-2007, 06:34 AM - volley (Senior Member, Cincinnati, OH)
You probably should have chosen a different title, because it isn't a universal truth. I happen to have an Nvidia 7950GX2 connected to a Sony Pearl on a 136" screen, and it's one of the best images I have seen. Going to Component skews the color accuracy for me. Also, you can calibrate your overall color space using even low-end tools like the Spyder2 Pro or SpyderTV Pro, and you can tweak your viewing preferences in your player if movies are your problem.

Not saying your post isn't valid, just that it's one data point to consider. Every TV is going to differ in how well it handles each connector type. The take-away should be: try both and see what works for you. Individual mileage may vary, that type of thing.
post #3 of 6, 01-24-2007, 07:48 AM - skepticon (Senior Member)
Also, I'll add:

Regarding centering, you can use your television's service menu (not the user menu) to adjust the horizontal and vertical centering. If your display is not centered properly over DVI, it is because they did a half-assed job of setting this up at the factory (they usually do). Check the "display devices" forums and look for the thread specific to your make/model to find out how to access the service menu.

Also, I found that my last Toshiba DLP did not accept full PC colorspace levels, forcing me to use the Nvidia control panel color correction to compensate. My new Samsung DLP accepts the full 0-255 PC colorspace without having to compress the output from the PC, resulting in a much better picture. No more banding!
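
For anyone wondering what "compressing the output from the PC" actually does: video-range signals put black at 16 and white at 235 instead of the PC's 0 and 255. A minimal sketch of the standard 8-bit mapping (the function name is my own illustration; the 219/255 scale factor is the standard Rec. 601/709 quantization):

```python
def full_to_video(level):
    """Map a full-range PC level (0-255) into limited 'video' range (16-235)."""
    return 16 + round(level * 219 / 255)

# Black, mid-gray, and white in video-range terms:
print(full_to_video(0), full_to_video(128), full_to_video(255))  # 16 126 235
```

A set that accepts 0-255 directly lets you skip that squeeze (and its rounding loss) entirely, which is exactly why the full-range Samsung shows less banding.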
post #4 of 6, 01-24-2007, 07:53 AM - stanger89 (AVS Addicted Member, Marion, IA)
Quote:
Originally Posted by badshah2000

I have built an HTPC around an Nvidia 7600GS card. The TV is a Toshiba 72MX196 (72" 1080p DLP).

After extensive testing and back-and-forth, I have concluded that using the HDTV Out on this card (which connects to the Component In of the TV) is far superior to using the (very tempting) DVI-->HDMI link. Believe me, I did try to calibrate the TV for each input using all the reference images and "guides" I could get my hands on. I even have a DVD that walks you through calibrating your TV using its controls.

That's a problem with your TV, not your nVidia card or the HDMI link. I run DVI-HDMI to my IN76 and it's great.

Quote:
1. Desktop text and icons are the only two things that look better with DVI.
2. Colors through HDTV Out are richer and deeper than over DVI.
3. Video (moving images: movies, TV [through the ATI Elite], etc.) is richer and *more* detailed over HDTV Out. DVI has a washed-out, slightly out-of-focus look and doesn't appear as detailed.

Sounds like your TV isn't properly calibrated for video levels over HDMI. Not sure why.

Quote:
4. This is very annoying with DVI: when you hook the TV up to the HTPC over a DVI-->HDMI connection, the image adjustment controls (which let you move the image up/down/left/right so that it's centred) are gone! I saw the same thing when using DVI with my desktop NEC monitors, but there it was never an issue. With TVs it is! I have read quite a few forums, and the missing centring controls in DVI mode seem to be haunting a lot of people across various makes and models of HDTVs.

That's a TV problem. You should be able to work around it by using the overscan compensation in the nVidia drivers.
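
Overscan compensation just renders the desktop into a slightly smaller area so the TV's crop removes only the added border. A rough sketch of the arithmetic, assuming an illustrative 3% crop per edge (the function and numbers are mine, not anything from the driver):

```python
def compensated_size(width, height, crop_per_edge=0.03):
    """Shrink the desktop so a TV that crops `crop_per_edge` of each edge
    still ends up showing the whole picture. Purely illustrative numbers."""
    w = round(width * (1 - 2 * crop_per_edge))
    h = round(height * (1 - 2 * crop_per_edge))
    # Keep dimensions even; video hardware generally prefers that.
    return w - (w % 2), h - (h % 2)

print(compensated_size(1920, 1080))  # -> (1804, 1014)
```

The driver slider does the equivalent internally, which is why it can work around the missing position controls.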

Quote:
It could be that I am missing something, but based on my experience I would recommend that you stick with the HDTV Out port Nvidia has provided on the card. It is there for a reason. DVI output is NOT meant for TVs, and it shows.

DVI works fine on many TVs. Your TV seems unable to deal with it correctly though.

Quote:
Maybe some of the artifacts I saw wouldn't be a big deal, or even obvious, on smaller-screen TVs, but on a large TV it is VERY obvious!

Definitely possible.

post #5 of 6, 01-24-2007, 08:17 AM - ditcho (Advanced Member, Calgary, Alberta)
I have an XFX Nvidia 7950 GT connected through DVI to a Samsung HLN56W DLP that's more than three years old. All I can say is: banding everywhere, in both VMR9 and overlay. I forced VMR9 to 0-255, and then to 16-255; same thing. I recently plugged in a 360 HD DVD add-on: horrendous banding in PowerDVD (which uses overlay), something I haven't seen since 1999, when the first dirt-cheap DVD players came to market. I switched to VGA; the banding was the same and everything else was worse.

I also thought it might be my old TV's fault, but the set looks so much better receiving an HD 1080i signal through component from my satellite receiver: banding is almost non-existent (you have to look long and hard to notice any, as opposed to it being in your face all the time with the HTPC), deep colours, very good black levels. I am starting to doubt the very idea of HTPCs being able to provide a hi-def picture comparable to dedicated CE devices. What is wrong here? Could it be that the earlier-generation HD sets just can't process non-component signals well, or were the home theatre software engineers at Nvidia (I didn't get much different results with ATI either) all laid off once decent SD DVD playback on the PC was achieved?
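
One thing worth noting about forcing levels back and forth: if the renderer squeezes 0-255 into 16-235 and a later stage expands it back, roughly 36 of the 256 codes become unreachable, and those gaps show up as bands in smooth gradients. A small sketch of the round trip (my own illustration of the general mechanism, not anything VMR9-specific):

```python
def full_to_video(x):
    """Squeeze full range (0-255) into video range (16-235)."""
    return 16 + round(x * 219 / 255)

def video_to_full(y):
    """Expand video range (16-235) back to full range (0-255)."""
    return round((y - 16) * 255 / 219)

# Count how many of the 256 levels survive a squeeze-then-expand round trip.
survivors = {video_to_full(full_to_video(x)) for x in range(256)}
print(len(survivors))  # 220 -- 36 levels are simply gone
```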
post #6 of 6, 01-24-2007, 09:48 AM - skepticon (Senior Member)
Ditcho,

I think the banding problem is that your (old) DLP only does 8-bit color processing; the newer sets do 10-bit processing. The old sets also had only about a 1000:1 contrast ratio, while the new ones have 10,000:1. Add the fact that the new Samsungs will accept 1080p over HDMI, and it might be time for an upgrade. Prices have come way down, and when the 2007 models come out in a few months, they'll practically be giving away the 2006 models.
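
To put a number on the 8-bit vs. 10-bit point: quantize a dark gradient spanning, say, 10% of the luminance range and count the distinct steps that survive at each depth. A back-of-the-envelope sketch (the span and sample count are arbitrary illustrations):

```python
def distinct_steps(bits, span=0.10, samples=4096):
    """Count distinct quantized levels in a gradient covering `span`
    of the full range at the given bit depth."""
    levels = (1 << bits) - 1
    return len({round(i / samples * span * levels) for i in range(samples + 1)})

print(distinct_steps(8))   # 27 steps  -> coarse enough to band visibly
print(distinct_steps(10))  # 103 steps -> about 4x finer transitions
```

Same source gradient, roughly four times the steps; that's the difference between visible contour lines and a smooth ramp.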