AVS Forum
Status
Not open for further replies.
1 - 19 of 19 Posts

·
Registered
Joined
·
232 Posts
Discussion Starter · #1 ·
There has been some discussion in some of the high-end upscaling DVD player threads about true 10-bit digital video, which can be passed only through HDMI.


We are all familiar with DVI being an 8-bit technology, but with the advent of players such as the latest top-of-the-line model from Denon, which can pass 10-bit video over HDMI in the YCbCr color space (as opposed to RGB), the question arises whether all HDMI-equipped displays can resolve the 10-bit information accurately (i.e., whether it is a standard part of the HDMI protocol), or only some, or none at all.


The benefits of 10-bit video over 8-bit, as I understand them, are akin to the benefits of multi-bit audio DACs over preceding generations of DACs in CD players. Essentially this means smoother video, less contouring (solarization), and better grayscale gradation.


Does anyone know which HDMI-equipped panels accept 10-bit input (HDMI YCbCr), or whether they all do as standard? I suspect not all do, since we already know many manufacturers did not implement the full HDMI spec on their inputs (no audio, etc.).
 

·
Registered
Joined
·
4,478 Posts
Quote:
Originally Posted by YellowCows
There has been some discussion in some of the high-end upscaling DVD player threads about true 10-bit digital video, which can be passed only through HDMI.


We are all familiar with DVI being an 8-bit technology, but with the advent of players such as the latest top-of-the-line model from Denon, which can pass 10-bit video over HDMI in the YCbCr color space (as opposed to RGB), the question arises whether all HDMI-equipped displays can resolve the 10-bit information accurately (i.e., whether it is a standard part of the HDMI protocol), or only some, or none at all.


The benefits of 10-bit video over 8-bit, as I understand them, are akin to the benefits of multi-bit audio DACs over preceding generations of DACs in CD players. Essentially this means smoother video, less contouring (solarization), and better grayscale gradation.


Does anyone know which HDMI-equipped panels accept 10-bit input (HDMI YCbCr), or whether they all do as standard? I suspect not all do, since we already know many manufacturers did not implement the full HDMI spec on their inputs (no audio, etc.).
Perhaps this one?

http://www.vanns.com/shop/servlet/it...ures/462999671
 

·
Registered
Joined
·
32,172 Posts
DVDs are encoded with 8-bit video, so this feature strikes me as largely irrelevant, imho. Upscaling resolution can at least provide some smoothing on certain contours; I don't see how you could meaningfully "upscale" color. And even if you could, I can't imagine it making a difference worth spending money on.
 

·
Registered
Joined
·
8,317 Posts
Quote:
Originally Posted by rogo
DVDs are encoded with 8-bit video, so this feature is particularly irrelevant imho.
But DVI wants 8-bit RGB, while DVDs are encoded in 8-bit YCbCr, as far as I know. Converting 8-bit YCbCr to 8-bit RGB can result in losing a bit of information (rounding errors, etc.), so being able to send YCbCr to the display does sound better. About 10-bit: don't some MPEG-2 decoders internally calculate at 10 bits to avoid rounding errors? I seem to remember reading that somewhere. It wouldn't hurt to take those 10 bits of YCbCr from the MPEG-2 decoder and send them directly to the display. Converting 10-bit YCbCr from the MPEG-2 decoder to 8-bit RGB doesn't sound too good to me, even if the DVD is encoded in 8-bit YCbCr.


But perhaps someone with more knowledge of the internal data paths can help out here?
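A quick sketch of the rounding/clipping loss described above. This assumes a BT.601 studio-swing conversion (my choice of matrix, not from the thread): round-trip a coarse grid of legal YCbCr codes through 8-bit RGB and count how many fail to survive.

```python
import itertools

def ycbcr_to_rgb(y, cb, cr):
    """BT.601 studio-swing YCbCr -> 8-bit studio RGB, clipped to 16..235."""
    ey, ecb, ecr = (y - 16) / 219.0, (cb - 128) / 224.0, (cr - 128) / 224.0
    er = ey + 1.402 * ecr
    eg = ey - 0.344136 * ecb - 0.714136 * ecr
    eb = ey + 1.772 * ecb
    clip = lambda v: min(235, max(16, round(16 + 219 * v)))
    return clip(er), clip(eg), clip(eb)

def rgb_to_ycbcr(r, g, b):
    """Inverse direction, also quantized to 8 bits."""
    er, eg, eb = (r - 16) / 219.0, (g - 16) / 219.0, (b - 16) / 219.0
    ey = 0.299 * er + 0.587 * eg + 0.114 * eb
    y  = round(16 + 219 * ey)
    cb = round(128 + 224 * (eb - ey) / 1.772)
    cr = round(128 + 224 * (er - ey) / 1.402)
    return y, cb, cr

# Sample the legal YCbCr cube on a coarse grid and round-trip it.
lost = total = 0
for y, cb, cr in itertools.product(range(16, 236, 8),
                                   range(16, 241, 8),
                                   range(16, 241, 8)):
    total += 1
    if rgb_to_ycbcr(*ycbcr_to_rgb(y, cb, cr)) != (y, cb, cr):
        lost += 1
print(f"{lost}/{total} sampled YCbCr codes do not survive the RGB round trip")
```

Most of the failures are YCbCr codes that fall outside the RGB gamut and get clipped, but double rounding can also cost codes that are in gamut.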
 

·
Registered
Joined
·
8,948 Posts
For those who like to plan ahead, what about HD/BD-DVD? Won't they be at least 10 bit?
 

·
Moderator
Joined
·
23,031 Posts
Quote:
Originally Posted by madshi
No, they'll still be 8bit YCbCr, AFAIK.
Hopefully they'll be encoded as, at least, 4:2:2 instead of 4:2:0. :) Or even 4:4:4.


larry
 

·
Registered
Joined
·
8,948 Posts
Quote:
Originally Posted by madshi
No, they'll still be 8bit YCbCr, AFAIK.
What about with HDMI?
 

·
Registered
Joined
·
8,317 Posts
I was talking about how the data was encoded on the HD-DVD/BluRay disc. HDMI is only the transport medium. HDMI doesn't care about whether the data comes from a DVD or from anywhere else.
 

·
Registered
Joined
·
8,948 Posts
Quote:
Originally Posted by madshi
I was talking about how the data was encoded on the HD-DVD/BluRay disc. HDMI is only the transport medium. HDMI doesn't care about whether the data comes from a DVD or from anywhere else.


FWIW, in another AVS Forum thread discussing the iScan VP30, Joe Murphy, Jr. said:


"There is one person who says the HDMI boards can accept native rate. I can't confirm that, but anything's possible. If it turns out to be true, the HDMI board would actually be better than the DVI-HDCP board. The reason is that even though the plasma uses 10-bit processing, the DVI-HDCP board needs an 8-bit RGB input whereas the HDMI board can accept a 10-bit YCbCr input."
 

·
Moderator
Joined
·
23,031 Posts
There are a couple DVD players that can send 10bit YCbCr out HDMI. 10bit processing is used on the 8bit source in the deinterlacer/scaler and the resultant 10bit YCbCr 4:2:2 is output via HDMI.
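This 10-bit-intermediate idea can be illustrated with a toy sketch. The two gain stages (0.8x then 1.25x) are made up purely for illustration; real deinterlacers/scalers do far more. The point is that carrying two extra bits between stages avoids the rounding damage an 8-bit pipeline accumulates:

```python
# Two made-up processing stages: attenuate by 0.8, then boost by 1.25.
# Net gain is 1.0, so a perfect pipeline would return every input unchanged.
errors_8bit = errors_10bit = 0
for x in range(256):  # every 8-bit input code
    # 8-bit path: round back to 8 bits between the two stages
    y8 = round(round(x * 0.8) * 1.25)
    # 10-bit path: keep 2 extra fractional bits (scale by 4) between stages
    y10 = round(round(x * 0.8 * 4) * 1.25 / 4)
    errors_8bit += (y8 != x)
    errors_10bit += (y10 != x)
print(f"8-bit intermediates:  {errors_8bit} of 256 codes come back wrong")
print(f"10-bit intermediates: {errors_10bit} of 256 codes come back wrong")
```

With 10-bit intermediates the round trip is exact for every code; with 8-bit intermediates, a noticeable fraction of codes land one level off, which is what shows up on screen as contouring.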


larry
 

·
Registered
Joined
·
232 Posts
Discussion Starter · #14 ·
Quote:
Originally Posted by PooperScooper
There are a couple DVD players that can send 10bit YCbCr out HDMI. 10bit processing is used on the 8bit source in the deinterlacer/scaler and the resultant 10bit YCbCr 4:2:2 is output via HDMI.


larry
Not only that, but upcoming scalers from the likes of DVDO and Crystalio (and upgraded Lumagens) are said to be able to output 10-bit video via HDMI.
 

·
Registered
Joined
·
32,172 Posts
There is a ton of info on BluRay and HD-DVD in a very long thread located in the HD Software Media forum. If you search on things like 4:2:0 and other search terms long enough, you'll learn more than you ever wanted. :)


I still don't think it matters a whit that a 480-line DVD source with 8-bit color gets output at 10 bits via HDMI, and I wouldn't go out of my way looking for the input on my HDTV. This seems even less interesting than the search for 1080p over HDMI.
 

·
Registered
Joined
·
6,092 Posts
Quote:
Originally Posted by madshi
But DVI wants 8-bit RGB, while DVDs are encoded in 8-bit YCbCr, as far as I know. Converting 8-bit YCbCr to 8-bit RGB can result in losing a bit of information (rounding errors, etc.), so being able to send YCbCr to the display does sound better. About 10-bit: don't some MPEG-2 decoders internally calculate at 10 bits to avoid rounding errors? I seem to remember reading that somewhere. It wouldn't hurt to take those 10 bits of YCbCr from the MPEG-2 decoder and send them directly to the display. Converting 10-bit YCbCr from the MPEG-2 decoder to 8-bit RGB doesn't sound too good to me, even if the DVD is encoded in 8-bit YCbCr.


But perhaps someone with more knowledge of the internal data paths can help out here?
We have had some discussions about this in the past in the FP $3500+ forum.


First thing to remember is that YCbCr is a transmission format. All video starts at the very beginning as RGB and HAS TO eventually go back to RGB at some point.


Second, full computer 8-bit RGB is 16 million colors. 8-bit video RGB is about 10.6 million colors. Now here's the shocker: 8-bit YCrCb is only about 3 million colors.
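Those counts can be sanity-checked with a rough sketch. This assumes a BT.601 studio-swing conversion with simple rounding (my assumptions; the exact figure depends on the matrix and rounding used): map every video-range 8-bit RGB triplet to 8-bit YCbCr and count the distinct codes hit.

```python
import numpy as np

# All studio-swing ("video level") 8-bit RGB triplets: 16..235 per channel.
v = np.arange(16, 236, dtype=np.float32)
er = ((v - 16) / 219)[:, None, None]
eg = ((v - 16) / 219)[None, :, None]
eb = ((v - 16) / 219)[None, None, :]

# BT.601 RGB -> YCbCr, quantized back to 8 bits.
ey = 0.299 * er + 0.587 * eg + 0.114 * eb
y  = np.rint(16 + 219 * ey).astype(np.int32)
cb = np.rint(128 + 224 * (eb - ey) / 1.772).astype(np.int32)
cr = np.rint(128 + 224 * (er - ey) / 1.402).astype(np.int32)

rgb_colors = 220 ** 3
ycbcr_colors = np.unique((y << 16) | (cb << 8) | cr).size
print(f"video-range 8-bit RGB triplets: {rgb_colors:,}")   # 10,648,000
print(f"distinct 8-bit YCbCr codes hit: {ycbcr_colors:,}")  # on the order of 3 million
```

Many RGB triplets collapse onto the same YCbCr code, which is the arithmetic behind the "only about 3 million colors" figure.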


Going from 8-bit YCrCb to 8-bit RGB is basically lossless since there are many more color combinations in the RGB color map.


Letting your source do the YCrCb-to-RGB conversion could easily be just as accurate as letting your display do it. I know my Bravo D1 does a better job, IMO, than my projector, but of course the opposite could equally be true: the display may do the better YCrCb-to-RGB conversion.


-Mr. Wigggles
 

·
Registered
Joined
·
6,092 Posts
Quote:
Originally Posted by PooperScooper
Hopefully they'll be encoded as, at least, 4:2:2 instead of 4:2:0. :) Or even 4:4:4.


larry
Blu-ray will likely be 4:2:0 YCrCb, since the RGB-to-YCrCb (or YUV) conversion is part of virtually any visual compression codec (VC-1, MPEG-2, H.264, etc.).


The reduction of chroma bandwidth is considered a "lossless" step in the compression process. I somewhat disagree, but it is hard to argue that not much is lost going from 4:4:4 RGB to 4:2:0 YCrCb, considering that the data reduction is 50% right off the top.


So it will most likely be 4:2:0 YCrCb, but hopefully they will up the quantization to 10 bits (Rogo, has that been decided at this point?). 10-bit YCrCb video has nearly 200 million colors, which should be plenty.
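The 50% reduction and the nearly-200-million-color figure above can be sanity-checked with a little arithmetic. The ~2.75 million figure used below for legal 8-bit YCbCr colors is my assumed ballpark, not a number from the thread:

```python
# Samples per pixel at each chroma subsampling: 4:4:4 keeps full-resolution
# Cb/Cr, 4:2:2 halves them horizontally, 4:2:0 halves them both ways.
samples_per_pixel = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}
cut = 1 - samples_per_pixel["4:2:0"] / samples_per_pixel["4:4:4"]
print(f"4:4:4 -> 4:2:0 drops {cut:.0%} of the raw samples")

# 10 bits vs 8 bits adds 2 bits to each of 3 channels: 2**6 = 64x the codes.
# Starting from an assumed ~2.75 million legal 8-bit YCbCr colors:
print(f"~{int(2_750_000 * 64 / 1e6)} million colors at 10 bits")
```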


-Mr. Wigggles
 

·
Registered
Joined
·
8,317 Posts
Thanks MrWigggles. So do you think that 10-bit YCrCb transport over HDMI is not really better than 8-bit RGB?


Rogo, what do you mean by "This seems even less interesting than the search for 1080p over HDMI"? Does that mean you don't care about 1080p over HDMI, or am I misunderstanding you? Don't you ever want to use an external scaler?
 

·
Registered
Joined
·
6,092 Posts
Quote:
Originally Posted by madshi
Thanks MrWigggles. So do you think that 10-bit YCrCb transport over HDMI is not really better than 8-bit RGB?


...
For the most part, I think the differences will be very minor. IF the source material were in fact encoded at 4:4:4 10-bit RGB, then I would definitely prefer the 4:4:4 8-bit RGB (i.e., sacrifice some bit depth for full chroma bandwidth).


But since I'm pretty sure HD-DVDs will be encoded at 4:2:0, I think that 4:2:2 10-bit YCrCb is obviously the best way to transmit the data, with 4:4:4 8-bit RGB being a very, very close second.


Showing a very dark scene on a very bright display will expose some posterization with 8-bit YCrCb, and to a much lesser extent with 8-bit RGB.
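A rough illustration of why dark scenes are the worst case: count how many quantization steps each bit depth offers in the darkest slice of the luma range (video-swing ranges assumed; the 5% cutoff is an arbitrary stand-in for "very dark scene"):

```python
# Video-swing luma ranges: 8-bit Y spans 16..235, 10-bit Y spans 64..940.
dark_fraction = 0.05  # arbitrary slice of the range representing deep shadows
steps_8bit  = round(dark_fraction * (235 - 16))   # ~11 steps
steps_10bit = round(dark_fraction * (940 - 64))   # ~44 steps
print(steps_8bit, steps_10bit)
```

Four times as many steps in the shadows is the difference between visible banding and a smooth gradient on a bright display.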


-Mr. Wigggles
 