
Registered · 1,512 Posts · Discussion Starter · #1
I've been curious about using DVI for a while. Unfortunately, my current projector doesn't have a DVI in.


IIRC, the DVI spec only allows for 8 bits per color channel, i.e. 24-bit color.


Now, I know from my HTPC adventures that 24-bit color can produce some contouring (especially on primaries, such as large red or green areas) - that's one reason the Radeon cards are rather popular (10-bit DACs). It seems that using DVI would negate this benefit, since only 8 bits per channel make it out the door.
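Here's a rough way to see the effect I mean - just a back-of-the-envelope sketch in Python, with the ramp range picked arbitrarily for illustration:

```python
import numpy as np

# A smooth ramp over a narrow brightness range, e.g. a dark red sky.
ramp = np.linspace(0.10, 0.14, 1920)   # ideal (continuous) values, 0..1

# Quantize the same ramp to 8 bits and to 10 bits per channel.
q8  = np.round(ramp * 255) / 255
q10 = np.round(ramp * 1023) / 1023

# Fewer distinct output levels means wider, more visible bands (contouring).
print(len(np.unique(q8)),  "levels at 8 bits")    # about 11 steps across the ramp
print(len(np.unique(q10)), "levels at 10 bits")   # about 41 steps across the ramp
```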


Does using a DVI link produce more contouring than an analog link?


Has anyone seen any [false] contouring introduced by DVI?


-Jon
 

Registered · 1,047 Posts
Since both DVD and HDTV are 8-bit sources, having an 8-bit interface to the display should be fine as long as there is no reprocessing of the signal that drops bits.
 

Registered · 10,201 Posts
The main benefit of the 10-bit DACs on a Radeon is for gamma correction. Except for the most recent versions, the Radeon only had 24-bit color, so the 8-bit values were passed through a gamma lookup table to produce a 10-bit result to send to the DACs.
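A minimal sketch of that kind of lookup, assuming a simple 1/2.2 power-law correction purely for illustration:

```python
import numpy as np

# 256-entry gamma lookup table: 8-bit code in, 10-bit code out.
# The 1/2.2 power law is an assumption here, just to show the mechanism.
x = np.arange(256) / 255.0
lut10 = np.round((x ** (1 / 2.2)) * 1023).astype(int)

# If the output were also limited to 8 bits, some inputs would collapse
# onto the same output code; with 10 bits every input stays distinct.
lut8 = np.round((x ** (1 / 2.2)) * 255).astype(int)
print(len(np.unique(lut8)),  "distinct outputs at 8 bits")   # fewer than 256
print(len(np.unique(lut10)), "distinct outputs at 10 bits")  # all 256 survive
```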


So if the display device in question has gamma correction capability built in, you can still get all the benefits of the Radeon's 10-bit DACs. Personally, that's the way I think it should be anyway: a projector should be responsible for its own gamma.


Now of course, some more recent video cards have implemented "deeper" color depths. But still, as Bob has pointed out, it's of limited use for DVD and HDTV... it is useful to keep the extra bits around when scaling those sources, though.
 

Registered · 1,512 Posts · Discussion Starter · #4
Michael,


I see your point. The DVI spec does recommend that display devices implement a gamma function (2.2, I believe).


What about oversampling and such, though?


A future algorithm (such as DCDi) might decide to interpolate between different colors (i.e. to smooth out areas, etc.). I was under the impression that some of the Sage chips did a type of oversampling, using 10-bit processing and 10-bit DACs for output. For analog displays (a CRT, for example) all 10 bits could potentially be displayed. Using DVI would prevent this.
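Even a trivial two-pixel interpolation shows the problem - this is a made-up example, not any particular chip's algorithm:

```python
# Two neighbouring 8-bit pixel values from a smooth area.
a, b = 100, 101

# A scaler interpolating halfway between them wants 100.5.
ideal = (a + b) / 2            # 100.5

# An 8-bit output has to round back to a whole code; the half-step is lost.
out_8bit = round(ideal)        # 100 (or 101, depending on rounding)

# With 10-bit output there are four codes per 8-bit step, so it survives.
out_10bit = round(ideal * 4)   # 402 on the 0..1023 scale
print(out_8bit, out_10bit)
```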


I'm especially interested in this as resolutions continue to grow. We're now scaling 480p sources to 720p - soon, it will be to greater resolutions. As CPU power increases, scaling algorithms will become smarter, and new crazy methods will be thought of. Having only 8 bits per channel seems rather limiting.


I'm approaching this the way one might approach upsampling of Red Book audio - 44.1 kHz can be upsampled, and many people prefer the added [interpolated] resolution.


Similarly, if current or future Sage chips (for example) perform any expansion of color depth (i.e. to 10 bits), this would be lost using DVI. If one were to send SDI directly into the projector, which would then feed it to the scaler/processor, you would be able to take advantage of any >8-bit processing. DLP is certainly one technology that could benefit from this, as TI claims that 30-bit color depth is available in still scenes (dropping to 24 bits in areas of motion).


Another spot this could be useful is in color space conversion: moving from the HDTV color matrix to the SDTV color matrix, component to RGB, and so on. You can guarantee that you won't get more than 24 bits of information on the output, but you can't guarantee that the result is exactly representable in 24 bits.
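A quick illustration of that last point, converting a single 8-bit YCbCr sample to RGB with the usual approximate BT.601 coefficients (the sample value itself is arbitrary):

```python
# One 8-bit studio-range YCbCr sample (values picked arbitrarily).
Y, Cb, Cr = 120, 100, 150

# Common approximate BT.601 YCbCr -> RGB matrix, full-range 0..255 output.
r = 1.164 * (Y - 16) + 1.596 * (Cr - 128)
g = 1.164 * (Y - 16) - 0.392 * (Cb - 128) - 0.813 * (Cr - 128)
b = 1.164 * (Y - 16) + 2.017 * (Cb - 128)

print(r, g, b)                        # fractional results, e.g. r ~ 156.2
print(round(r), round(g), round(b))   # what an 8-bit-only output must send
```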


Or am I completely off in left field here?


-Jon
 

Registered · 1,512 Posts · Discussion Starter · #5



No one else?


-Jon
 

Registered · 10,201 Posts
Everything you've said is a good point, Jon, and good support for the idea that all the scaling and other video processing should be performed after the DVI receiver (i.e., in the projector). Even if the original source is 8 bits, truncating back to 8 bits after any sort of video processing always introduces noise.
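As a rough illustration (the two adjustments below are arbitrary stand-ins for any two processing steps): rounding to 8 bits between steps leaves some codes off by a step compared with keeping full precision and rounding once at the end.

```python
import numpy as np

# An 8-bit source ramp.
src = np.arange(256, dtype=float)

def step1(x): return x * 0.85 + 10    # stand-in for one processing step
def step2(x): return x * 1.18 - 12    # stand-in for a later step

# Truncate to 8 bits after each step (what a chain of 8-bit hand-offs forces).
chained = np.round(step2(np.round(step1(src))))

# Keep full precision through both steps and round once at the end.
full = np.round(step2(step1(src)))

# The intermediate rounding shows up as extra error on some codes.
print(np.max(np.abs(chained - full)))   # > 0: noise added by the 8-bit hand-off
```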


On the other hand, I believe HDMI has support for higher color depths. That will probably help this issue.
 

Registered · 1,512 Posts · Discussion Starter · #7
Hmmm... I must check out the HDMI angle.


Thanks, Michael.


-Jon
 

Registered · 522 Posts
Hi Michael,

the usual way to do it today is to use the analog video input, where you then get the video processing. With DVI you bypass that processing and go straight to the scaling.

Best

Armin
 

Registered · 1,512 Posts · Discussion Starter · #9
Armin2,


With DVI you cannot always go straight to scaling. DVI does not support 480i rates, so deinterlacing is necessary for DVD sources before you can send the video out over DVI. My point here is that once you introduce deinterlacing/scaling, the 8-bit-per-channel limit of DVI may not be enough.


-Jon
 

Registered · 3,102 Posts
Michael, you talk about HDMI. I have read the FAQ at the HDMI website, but there is one question I still do not understand. Since it carries everything over one cable, will there be two HDMI outputs (I am led to believe it will use the same connector as current DVI): one for audio to an audio receiver and one for video to a display (let's say a PJ)?
 