1 - 20 of 41 Posts

·
Registered
Joined
·
30 Posts
Discussion Starter · #1 ·
Ok, I am sure this is a very basic question for this group:


Do the DVI inputs on current TV sets (like, say, a Sony 32" direct-view HD set) accept a signal from the DVI outputs on a computer video card? Or are these inputs only suitable for HD STBs and the new DVD players?


I have received conflicting info on this, please let me know.
 

·
Registered
Joined
·
807 Posts
In addition, depending on the intended use of the DVI input on the TV, there may be a whole lot of overscan to deal with. So being able to get into the service menu and adjust overscan may be a must as well.


That's the case with the GWII (50" or 60")...


-Rob-
 

·
Registered
Joined
·
10,500 Posts
I think my Sony XBR800 has the same type of DVI-HDCP (High-bandwidth Digital Content Protection) input. It's intended for the new DVI-HDCP settop receivers, etc., like you've been told. But it seems a lot of folks have been able to get it to work as a computer input as well with the right kind of DVI cable, etc. Try at your own risk though.


Here are a couple recent AVS threads on the subject:
~ 1080i output through DVI with Radeon?
~ 1920x1080p->1080i w/ DVI cable or VGA transcoder?
~ Radeon 8500 DVI Output
~ Need help Troubleshooting DVI Sparklies
~ Sony DVI/HDCP connect to a PC? Includes details on MonInfo, Ashley Saldanha's tool for analyzing the signals different displays may accept.
~ Computer Input for Sony HDTV CRTs


Some DVI and HDCP links:
~ HDCP: what it is and how to use it
~ How DVI, HDMI and HDCP work
~ Illustration showing different DVI connectors


Some basic ATI custom resolution links:
~ RADEON GUIDE
~ Karnis's Custom Resolution Guide for 1080i HDTV-HTPC-POWERSTRIP-RADEON
~ ati custom resolution without powerstrip
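As a companion to the custom-resolution guides above, here's a quick sketch of how you might sanity-check a candidate timing before entering it in PowerStrip. The 2200x1125 totals and 74.25 MHz pixel clock are the standard published 1080i numbers; the function itself is just my own illustration of the arithmetic:

```python
# Sanity-checking a custom modeline: refresh rate = pixel clock / (total
# pixels per frame). The 1080i totals below (2200x1125 @ 74.25 MHz) are
# the standard CEA/EIA-861 timing; other inputs would be guesses.

def refresh_rates(pixel_clock_hz, h_total, v_total, interlaced=False):
    """Return (frame_rate, field_rate) in Hz for a given timing."""
    frame_rate = pixel_clock_hz / (h_total * v_total)
    field_rate = frame_rate * 2 if interlaced else frame_rate
    return frame_rate, field_rate

frame, field = refresh_rates(74_250_000, 2200, 1125, interlaced=True)
print(f"1080i: {frame:.2f} Hz frame rate, {field:.2f} Hz field rate")
# -> 1080i: 30.00 Hz frame rate, 60.00 Hz field rate
```

If the numbers you compute don't land close to a rate the TV advertises (59.94/60 Hz fields for 1080i, for instance), the timing probably isn't worth trying.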
 

·
Registered
Joined
·
1,087 Posts
The HDCP portion of DVI-HDCP is copy protection for the DVI outputs of STBs. The TV doesn't care if there is or isn't HDCP communication/handshaking on the DVI-HDCP link. I was worried about the HDCP portion until I contacted Silicon Image, the developer of DVI.


STBs do (or will) care about HDCP: they won't output an HD DVI stream unless they handshake with a DVI-HDCP TV. The idea is that this makes STBs with DVI-HDCP a closed system: the STB would not output hi-res to a DVI recorder (if consumers had such a thing), thus preventing perfect digital copies of HD programs. The operating frequencies of the digital DVI signals are so high that you can't create a Y-cable/splitter and tap into the bitstream as it's being fed from the STB to the TV.


Otherwise, it's just a DVI input, with all the overscan problems mentioned previously.
 

·
Registered
Joined
·
10,500 Posts
Interesting stuff, vkristof.


I got the impression from ATI that newer video cards with the DVI-I interface (as opposed to DVI-D) might be more friendly to these kinds of HDTV inputs. Can you see any logic in this?
 

·
Registered
Joined
·
1,207 Posts
Quote:
Originally posted by vkristof
The HDCP portion of DVI-HDCP is copy protection for the DVI outputs of STBs. The TV doesn't care if there is or isn't HDCP communication/handshaking on the DVI-HDCP link.
And I'm praying that it will stay that way.

Hopefully they don't change it and HTPC will be able to continue as is.



peace
 

·
Registered
Joined
·
597 Posts
This brings up yet another question: is the DVI output on the ATI cards DVI-I? If so, can I not connect them to my projector that has a DVI-D input, or is it just a matter of selecting the correct cable?
 

·
Registered
Joined
·
807 Posts
The ATI cards have a DVI-I output (at least that's what my 9700 AIW has). This means you can hook it up to all forms of DVI on the projector/TV side: DVI-I, DVI-A, or DVI-D. The latter two are simply subsets of DVI-I. For DVI-A the analog part is used (regular RGB; that's how the 9700 provides an analog video signal through a simple connector that breaks out the analog part). DVI-D is the digital part; that's what my GWII uses.


That means you don't need a DVI-I cable if your TV has a DVI-D input. Just a DVI-D cable will do (fewer leads since the analog part is missing, therefore cheaper). Of course, a DVI-I cable would work fine too, but a DVI-A cable won't (if such a thing exists; I've never seen one that only passes the analog part).
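That subset logic can be sketched as a toy compatibility check (purely illustrative; the connector names are the only real data here):

```python
# Toy model of DVI connector compatibility: DVI-I carries both links,
# DVI-A and DVI-D are each a subset. A cable works only if it carries a
# link present on both the source and the sink.
DVI_LINKS = {
    "DVI-I": {"analog", "digital"},
    "DVI-A": {"analog"},
    "DVI-D": {"digital"},
}

def cable_works(source, sink, cable):
    """True if some link type is shared by source, sink, and cable."""
    return bool(DVI_LINKS[source] & DVI_LINKS[sink] & DVI_LINKS[cable])

print(cable_works("DVI-I", "DVI-D", "DVI-D"))  # True  (digital path exists)
print(cable_works("DVI-I", "DVI-D", "DVI-A"))  # False (sink has no analog)
```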


-Rob-
 

·
Registered
Joined
·
10,500 Posts
Thanks for the clarification, Rob. Based on some reading I've been doing over in Display Devices, it appears the DVI-HDCP port on my Sony XBR800 is probably also DVI-D.


Also, I believe ATI's pre-8500 cards are DVI-D, rather than DVI-I.


I've been thinking about this a lot lately... one reason ATI may be recommending its newer DVI-I cards for use with HDTVs is their broader range of compatibility. However, there may be another, more subtle reason as well.


The newer DVI-I ATI cards also have a TMDS transmitter that is HDCP compliant. Although this may have no bearing on general computer connectivity now, as vkristof indicated, this would seem to be part of the HDCP handshaking that may be necessary to display copy-protected high-def content from the DVI ports on these cards in the future.


For all I know, this may even be coming into play right now for people trying to use their DVI output to watch HD broadcasts from a tuner card like the FusionHD.


This is mostly guesswork on my part though.
 

·
Registered
Joined
·
10,500 Posts
Another thing that concerns me about going directly into the DVI-HDCP port is the manner in which RGB is converted to YUV. Some YUV video equipment isn't designed to handle all the possible colors in the RGB color space. If the signal isn't properly filtered so it's NTSC/ATSC compliant, there is a possibility of damage being done.


Where or how this filtering may be done in a DVI-I->DVI-HDCP setup is still unclear to me. I suppose HDTVs may have some protective filtering built-in, but I'm not sure about that.
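For reference, the RGB-to-YCbCr conversion being discussed typically uses the published BT.601 coefficients. A minimal sketch (my own illustration, not anything a particular TV is guaranteed to implement):

```python
# BT.601 conversion from full-range 8-bit RGB to studio-swing YCbCr.
# Note that a proper conversion maps peak white to Y=235 and black to
# Y=16, i.e. it lands inside the broadcast-legal range by construction.

def rgb_to_ycbcr_bt601(r, g, b):
    """Full-range 8-bit RGB -> studio-swing 8-bit YCbCr (BT.601)."""
    r, g, b = r / 255.0, g / 255.0, b / 255.0
    y  = 16  +  65.481 * r + 128.553 * g +  24.966 * b
    cb = 128 -  37.797 * r -  74.203 * g + 112.000 * b
    cr = 128 + 112.000 * r -  93.786 * g -  18.214 * b
    return round(y), round(cb), round(cr)

print(rgb_to_ycbcr_bt601(255, 255, 255))  # peak white -> (235, 128, 128)
print(rgb_to_ycbcr_bt601(0, 0, 0))        # black      -> (16, 128, 128)
```

The worry in the thread is about what happens when this filtering step is skipped or done differently, not about the math itself.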
 

·
Registered
Joined
·
10,500 Posts
After poking around a bit more, I've gathered a little more info on some of this.


DVI can be used to transmit a number of different kinds of video signals. They seem to break down like this:


VESA 24-bit Digital RGB (used for computer displays)

EIA-861 24-bit Digital RGB (HDTV)

BT 656/601 Digital Video

Analog RGB (also for computers, and the same as VGA uses)


(Analog RGB requires DVI-I, but the others only appear to require DVI-D.)


The relevant formats here are VESA Digital RGB, and the EIA-861 Digital RGB. VESA Digital RGB is what's typically used for DVI computer monitors, while EIA-861 Digital RGB is typically used for HDTV.


The HDCP-DVI ports on current TVs should be designed for EIA-861 Digital RGB. (The specifications for each TV should indicate whether the port is EIA-861 or VESA compliant.)


The EIA-861 specifications include resolutions and timings specifically for ATSC HDTV formats such as 480i, 480p, 720p, 1080i. Aside from the resolutions and timings though, it's difficult to see much difference between the VESA Digital RGB and EIA-861 Digital RGB. Both appear in essence to be 24-bit Digital RGB. (It's unclear to me though if this RGB data has to be converted to YUV before being displayed on a typical HDTV, since the CRT itself uses RGB. One would think that this should not be necessary on a true digital TV.)


Since EIA-861 Digital RGB is designed for HDTV broadcasts, video transmitted in this standard may typically be pre-filtered for NTSC/ATSC compliance. Analog NTSC YUV video is limited to 220 levels of luminance. Values beyond this are not considered legal for broadcast. To ensure NTSC compliance, the palette of 24-bit RGB video, which normally has 256 levels, is usually clipped below 16 and above 235 in video intended for broadcast.
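That clipping can be sketched in a few lines (a hypothetical illustration of the 16-235 restriction, not actual driver or TV behavior):

```python
# Restricting 8-bit RGB components to the broadcast-legal 16-235 range.
# Full-range computer RGB (0-255) gets clamped at both ends.

def clamp_to_video_levels(value):
    """Clip one 8-bit component into the legal 16-235 range."""
    return max(16, min(235, value))

def legalize(pixel):
    """Apply the clamp to an (R, G, B) tuple."""
    return tuple(clamp_to_video_levels(c) for c in pixel)

print(legalize((0, 128, 255)))  # -> (16, 128, 235)
```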


The question is: could RGB values outside this range potentially damage an HDTV that is only intended for NTSC/ATSC-compliant EIA-861 RGB? The couple of video technicians I've spoken to say they've never heard of illegal RGB values damaging an NTSC monitor, except in some much older video equipment where different voltages were also involved. Their experience with the new DVI inputs may be limited though. And this would not be as much of an issue with other kinds of inputs such as composite, S-video and YPbPr, because these all use an analog YUV color space, rather than RGB.


However, I also spoke to a manufacturer of a high-end video scaler which includes both VESA Digital RGB and EIA-861 Digital RGB options for their DVI output. And they told me that they had no trouble using the VESA computer mode to go into the EIA-861 HDCP-DVI port on some Sony HDTVs they tested.

Added 5/3/03: In addition, though I haven't seen the EIA-861 specs in the flesh, so far I have seen nothing to indicate that they contain any particular palette requirements on top of the HD resolutions and timings. It's usually just referred to as a "24-bit Digital RGB" format.


In the absence of any hard facts from the TV mfrs themselves, I guess my conclusion from all this is that the chance of harming a TV by attaching a computer to the HDCP-DVI port is probably relatively low, particularly if a restricted RGB palette is used. I think a lot of the admonitions by mfrs and salespeople against doing this are more related to burn-in, ignorance and toeing the company line than the capabilities of the DVI port to handle computer signals.


However, if a YPbPr computer input looks acceptable, and you plan to connect other STBs to the DVI port, then there's probably no point in taking even a small risk of damaging it.


And there may be variations in how this would work, depending on the TV and videocards involved.
 

·
Premium Member
Joined
·
1,374 Posts
Here's a link for the DVI chip used by some Sony WS500 series RPTVs:
GM7030 overview


Looks like even if you hook up an HTPC and the DVI interface says it "accepts" the resolution, the GM7030 will rescale it to the few scan rates the set actually supports.


However, there's something else interesting in the service manual. It lists separate settings for "VGA" as well as standard 1080i, 480p, and 480i. Perhaps it's running 1080i, but with video settings optimized for a computer desktop. Or perhaps it can run something close, like unscaled 800x600.


Sorry, can't verify, don't have set...
 

·
Registered
Joined
·
10,500 Posts
Interesting tidbit, Dave. As you say, it looks as though that Faroudja chip is designed to convert both analog and digital RGB to the scan rates supported by the CRT, allowing a lot more flexibility in the type of DVI input.


That chip looks more advanced than the one used on my TV though. If it supports analog RGB, then it should use a DVI-I style port, rather than the DVI-D style used on my XBR-800 and some other EIA-861 DVI TVs.


When I get a chance, I'll look in the service manual of the XBR800 to see if there's any mention of VESA resolutions. I kinda doubt there would be though, because I haven't heard of anyone successfully running those modes on sets with just the EIA-861 HDCP DVI-D digital input. Worth a look though.


If the set directly supported VESA modes, you'd kinda expect that to be mentioned in their advertised features and user manual.


Also, added a small addendum to Post #15 above.
 

·
Registered
Joined
·
265 Posts
Heads up on the Toshiba 34HDX82, no luck on that one. The DVI doesn't work. It's DVI-D single link; everything gets hooked up, etc., but the best people have accomplished was to get the POST screens to work. When it loads into Windows it's a no-go, so people assumed it was driver related. The best result with an Nvidia card was 640x480 in Windows, and even then there were still problems. I think I will contact Toshiba, though the manual specifically says not for PC use... ah well, I don't buy that ;)
 

·
Registered
Joined
·
10,500 Posts
Yep, there's bound to be a lot of variation in how this will work depending on the video cards and TVs involved.


As mentioned above, the newer ATI Radeons (8500 and above) with DVI-I & HDCP-compliant TMDS transmitter seem like the best bet to me, so far.


Gotta make sure you're using resolutions and refresh rates the TV can handle as well though, regardless of the card.
 

·
Registered
Joined
·
1,082 Posts
Quote:
Originally posted by ADU
Yep, there's bound to be a lot of variation in how this will work depending on the video cards and TVs involved.


As mentioned above, the newer ATI Radeons (8500 and above) with DVI-I & HDCP-compliant TMDS transmitter seem like the best bet to me, so far.


Gotta make sure you're using resolutions and refresh rates the TV can handle as well though, regardless of the card.
I think I'll try an ATI Radeon 9000 card on my 34XBR800. However, does anyone know where to find XBR800 "supported" resolutions & refresh rates?
 