Bit-Color Analysis: HDMI 1.3 Potentials and Real-World Performance - AVS Forum
post #1 of 10, 04-19-2007, 07:46 PM - crellion (Member, Thread Starter)
I've been doing some research on the increased color-depth capabilities of HDMI 1.3. The maximum color depth HDMI 1.3 supports is 48-bit... Recently, more HDMI 1.3 displays have been announced, including the new generation of Sony XBR displays. One of their flagship models, the 70" BRAVIA XBR® LCD Flat Panel HDTV KDL-70XBR3, has HDMI 1.3 capabilities. However, if you dig deeper into its specs you see that it only has a 10-bit panel. This means that an HDMI 1.3 "higher-bit" source fed into the TV should result in much better color reproduction, but would not achieve the full 48-bit capability of the HDMI 1.3 spec.
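For anyone curious where the usual color counts come from, here is a quick sketch of the arithmetic (nothing HDMI-specific, just powers of two): the per-channel depth is the total divided by three, and the per-channel value counts multiply together.

Code:
# Rough arithmetic behind the HDMI RGB bit-depth figures: bits per channel is
# the total divided by three, and the number of representable colors is the
# per-channel value count cubed.
for total_bits in (24, 30, 36, 48):
    per_channel = total_bits // 3
    colors = (2 ** per_channel) ** 3
    print(f"{total_bits}-bit RGB: {per_channel} bits/channel, {colors:,} colors")

By that math a 10-bit panel tops out around the 30-bit line (roughly 1.07 billion colors), which is why a higher-bit source cannot be reproduced at the full 48-bit precision on such a display.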

One note here: the current cost of making a 48-bit capable display is astronomical and not feasible for the consumer market. Even the newest A/V receivers from Onkyo and Denon that have HDMI 1.3 only support up to 36-bit color...

Anyway, this is what I've found so far. If you think I've made any wrong statements, please let me know, as I'm still gathering information on this whole expanded-color topic.
post #2 of 10, 04-20-2007, 05:05 PM - ptsenter (Advanced Member)
As has been stated time and again, 1.3 compliance per se does not mean much: all of these nice features, e.g. Deep Color, xvYCC, hi-def audio, are optional. All devices with an HDMI connector on the shelves today are automatically 1.3 compliant - all that's needed is 480p and stereo. It's up to the manufacturers whether to support them, and how deeply (pun intended).
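For what it's worth, those optional capabilities are advertised by a sink in its EDID, in the HDMI Vendor-Specific Data Block. Below is a rough sketch of how the Deep Color flag bits could be decoded; the byte offset and bit positions are my reading of the 1.3a spec's VSDB layout, so treat them as an assumption rather than a reference.

Code:
# Sketch: decode the Deep Color flags a sink advertises in its EDID.
# 'vsdb' is assumed to be the HDMI Vendor-Specific Data Block payload with the
# tag/length byte already stripped, so it starts with the 0x03 0x0C 0x00 IEEE
# OUI. Offsets and bit positions are my reading of the 1.3a spec and may be off.
def deep_color_caps(vsdb: bytes):
    if len(vsdb) < 6 or vsdb[0:3] != bytes([0x03, 0x0C, 0x00]):
        return None  # not an HDMI VSDB
    flags = vsdb[5]  # the byte carrying the DC_* capability bits
    return {
        "DC_48bit": bool(flags & 0x40),
        "DC_36bit": bool(flags & 0x20),
        "DC_30bit": bool(flags & 0x10),
        "DC_Y444":  bool(flags & 0x08),
    }

# Example: a sink claiming 30-bit and 36-bit RGB plus Deep Color over YCbCr 4:4:4
print(deep_color_caps(bytes([0x03, 0x0C, 0x00, 0x10, 0x00, 0x38])))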
post #3 of 10, 04-20-2007, 07:00 PM - Richard Paul (AVS Special Member)
Quote:
Originally Posted by crellion View Post

I've been doing some research on the increased color-depth capabilities of HDMI 1.3. The maximum color depth HDMI 1.3 supports is 48-bit...

True, but I would point out that although Deep Color can support 48-bit RGB, not even the HDMI organization expects it to be used in consumer equipment, and they have even stated as much on their website. In fact, from what I have read of the HDMI 1.3 spec, 36-bit RGB is the baseline for Deep Color support, since it is the only mode that is mandatory once Deep Color is supported at all. The 30-bit and 48-bit modes are optional, though most, if not all, Deep Color displays will probably support 30-bit RGB as well.
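In other words, as I read it the rule is simply: a device may advertise any subset of the Deep Color depths, but if it advertises any of them, 36-bit must be among them. A minimal sketch of that check:

Code:
# Sketch of the Deep Color rule described above: 30-bit and 48-bit are optional,
# but a device that claims any Deep Color mode must also claim 36-bit.
def deep_color_claim_is_valid(modes):
    """modes: set of advertised RGB depths in bits, e.g. {24, 30, 36}."""
    deep_modes = modes & {30, 36, 48}
    return not deep_modes or 36 in deep_modes

print(deep_color_claim_is_valid({24}))          # True  - no Deep Color at all
print(deep_color_claim_is_valid({24, 30, 36}))  # True  - typical Deep Color display
print(deep_color_claim_is_valid({24, 30}))      # False - 30-bit only breaks the rule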


Quote:
Originally Posted by crellion View Post

Anyway, this is what I've found so far. If you think I've made any wrong statements, please let me know, as I'm still gathering information on this whole expanded-color topic.

Well, just to point this out: the PS3 was actually used in a demonstration at CES to show off the difference between 24-bit and 30-bit RGB. Deep Color support hasn't been enabled on the PS3 yet, but it is very likely we will see it enabled via a firmware update sometime this year.
post #4 of 10, 04-22-2007, 06:55 PM - ptsenter (Advanced Member)
Quote:
Originally Posted by Richard Paul View Post

In fact, from what I have read of the HDMI 1.3 spec, 36-bit RGB is the baseline for Deep Color support, since it is the only mode that is mandatory once Deep Color is supported at all. The 30-bit and 48-bit modes are optional, though most, if not all, Deep Color displays will probably support 30-bit RGB as well.

Now, that's funny: even though the Bravia (and the PS3, for that matter) mentioned above supports 480p and stereo, it cannot claim 1.3 compliance because, as you correctly noted, if it supports Deep Color at all it is required to support 36-bit (1.3a spec, p. 86). So if it does not support 36-bit, it's not compliant. It looks like an inconsistency within the standard itself.
post #5 of 10, 04-22-2007, 11:03 PM - Richard Paul (AVS Special Member)
Quote:
Originally Posted by ptsenter View Post

Now, that's funny: even though the Bravia (and the PS3, for that matter) mentioned above supports 480p and stereo, it cannot claim 1.3 compliance because, as you correctly noted, if it supports Deep Color at all it is required to support 36-bit (1.3a spec, p. 86). So if it does not support 36-bit, it's not compliant.

I think you are jumping the gun, for two reasons. First, what is required for Deep Color support is that the HDMI chips are able to either transmit or receive a 36-bit RGB signal. That says nothing about what is required of the display, and a 10-bit display output might be considered sufficient for a Deep Color display. Second, just because they used a 30-bit RGB signal for the PS3 demonstration does not mean that the PS3 is not capable of 36-bit RGB output. That might simply have been a limitation of the source material used for the demonstration, or perhaps even of the display that was used.


Quote:
Originally Posted by ptsenter View Post

It looks like inconsistency within the Standard itself.

To me it looks like many, maybe even most, of the Deep Color displays coming out this year are best suited to 30-bit RGB. In and of itself that is not a bad thing: roughly 1 billion color combinations is a good improvement over 16.7 million.
post #6 of 10, 04-23-2007, 01:13 AM - ptsenter (Advanced Member)
Quote:
Originally Posted by Richard Paul View Post

I think you are jumping the gun, for two reasons. First, what is required for Deep Color support is that the HDMI chips are able to either transmit or receive a 36-bit RGB signal. That says nothing about what is required of the display, and a 10-bit display output might be considered sufficient for a Deep Color display. Second, just because they used a 30-bit RGB signal for the PS3 demonstration does not mean that the PS3 is not capable of 36-bit RGB output. That might simply have been a limitation of the source material used for the demonstration, or perhaps even of the display that was used.

1.
1.3a Spec p.86:
"Color depths greater than 24 bits are defined to be "Deep Color" modes. All Deep Color modes are optional though if an HDMI Source or Sink supports any Deep Color mode, it shall support 36-bit mode."

1.3a Spec p.4:
"shall _______ A key word indicating a mandatory requirement. Designers are *required* to implement all such mandatory requirements."

The standard does not distinguish between chip and display; it talks about Sink and/or Source, and the EDID associated with them describes the capabilities of the entire device, not its components. If the display is 10-bit, it cannot report that it supports 12-bit regardless of the capabilities of an internal chip - otherwise it's cheating.

2. True, but it's guilty until proven, or at least claimed, otherwise.

Quote:


To me it looks like many, maybe even most, of the Deep Color displays coming out this year are best suited to 30-bit RGB. In and of itself that is not a bad thing: roughly 1 billion color combinations is a good improvement over 16.7 million.

First off, I'm not talking about the displays, but about the standard itself. Secondly, I don't feel like I need to drop to my knees just because they are going to release substandard products. There are plenty of those as it is.
post #7 of 10, 04-23-2007, 05:33 PM - Richard Paul (AVS Special Member)
Quote:
Originally Posted by ptsenter View Post

The standard does not distinguish between chip and display; it talks about Sink and/or Source, and the EDID associated with them describes the capabilities of the entire device, not its components.

The definition you are using just isn't realistic. For instance, take an HDMI 1.3 source like the HD-XA2: what you have is source material at roughly 12 bits per pixel (8-bit 4:2:0 YCbCr) being converted into 36-bit RGB.
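To make that conversion concrete, here is a toy sketch (my own, using the usual BT.709 video-range matrix, not anything pulled from the HD-XA2): a single 8-bit YCbCr sample run through the matrix at higher intermediate precision naturally lands on 12-bit RGB codes that are not simple multiples of 8-bit ones.

Code:
# Toy conversion of one 8-bit video-range BT.709 YCbCr sample to 12-bit RGB.
# Coefficients are the standard BT.709 limited-range ones; the final *16 scale
# maps an 8-bit-range result onto the 12-bit (0-4095) code range.
def ycbcr8_to_rgb12(y, cb, cr):
    r = 1.164 * (y - 16) + 1.793 * (cr - 128)
    g = 1.164 * (y - 16) - 0.213 * (cb - 128) - 0.533 * (cr - 128)
    b = 1.164 * (y - 16) + 2.112 * (cb - 128)
    clamp = lambda v: max(0, min(4095, int(round(v * 16))))
    return clamp(r), clamp(g), clamp(b)

print(ycbcr8_to_rgb12(180, 90, 140))  # an arbitrary warm mid-tone

The extra output precision comes from the matrix math, of course, not from the disc: the underlying material is still 8-bit 4:2:0.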


Quote:
Originally Posted by ptsenter View Post

If the display is 10-bit, it cannot report that it supports 12-bit regardless of the capabilities of an internal chip - otherwise it's cheating.

Are you sure of that, or are you guessing? After all, I can tell you here and now that the HD-XA2 cannot actually accept source material at 36-bit RGB.


Quote:
Originally Posted by ptsenter View Post

First off, I'm not talking about the displays, but about the standard itself.

I don't see anything wrong with the HDMI 1.3 standard. In fact, it is rather innovative in terms of the features that were added to it, and in my opinion it is the best digital video standard out there.


Quote:
Originally Posted by ptsenter View Post

Secondly, I don't feel like I need to drop to my knees just because they are going to release substandard products. There are plenty of those as it is.

I didn't know that being able to accept and display 30-bit RGB video was considered substandard nowadays. Not to be rude, but do you have some sort of chip on your shoulder when it comes to HDMI? You seem to be looking for reasons to dislike it.
post #8 of 10, 05-04-2007, 03:37 PM - HDMI_Org (Senior Member, Sunnyvale, CA)
The HDMI spec says that if Deep Color is implemented, then the 12-bit mode is required, with 10-bit mode being an option.

As for a display with a 10-bit LCD, the key phrase in the HDMI spec is that devices must "support" 12-bit. This can be done by having the video processor accept a 12-bit HDMI stream and then eventually reducing it down to 10-bit at the LCD glass. In fact, most video processors prefer the incoming video to carry a higher level of precision than the actual display so that their algorithms are more precise.
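As an illustration of that last step (purely a sketch of the general idea, not any particular vendor's pipeline): once the processing has been done at 12 bits, the drop to a 10-bit panel can be a plain rounding, or a dithered one that trades a little noise for smoother gradients.

Code:
import random

# Reduce a 12-bit code (0-4095) to a 10-bit code (0-1023). With dithering,
# noise of up to half an output step is added before rounding so that smooth
# 12-bit ramps do not collapse into hard 10-bit banding.
def to_10bit(value12, dither=True):
    if dither:
        value12 += random.uniform(-2.0, 2.0)  # half of the 4-code output step
    return max(0, min(1023, int(round(value12 / 4.0))))

ramp12 = list(range(1000, 1032))                     # a shallow 12-bit gradient
print([to_10bit(v, dither=False) for v in ramp12])   # hard 10-bit steps
print([to_10bit(v) for v in ramp12])                 # dithered version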
post #9 of 10, 05-11-2007, 10:53 AM - ptsenter (Advanced Member)
Quote:
Originally Posted by HDMI_Org View Post

In fact, most video processors prefer the incoming video to carry a higher level of precision than the actual display so that their algorithms are more precise.

It's true that video processors, or more precisely the algorithms they employ, use more digits for intermediate calculations, which are then rounded off or truncated for the final result - to preserve the original precision, not to lose it.
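A toy illustration of that point (nothing to do with any particular product): run every 8-bit code through two small adjustments, once carrying the intermediate math at full precision and once rounding back to 8 bits between stages, and a fair number of codes come out a step different.

Code:
# Toy example: a gamma tweak followed by a gain, applied to every 8-bit code.
# Rounding back to 8 bits between the two stages changes the final result for
# some codes; carrying the extra precision through and rounding only once at
# the end preserves the original intent better.
def process(code, keep_precision):
    x = code / 255.0
    x = x ** 0.9                      # stage 1: small gamma adjustment
    if not keep_precision:
        x = round(x * 255) / 255.0    # throw the extra precision away early
    x = x * 0.95                      # stage 2: slight gain reduction
    return min(255, round(x * 255))

diffs = sum(process(c, True) != process(c, False) for c in range(256))
print(f"{diffs} of 256 input codes differ when precision is dropped between stages")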

Quote:


As for a display with a 10-bit LCD, the key phrase in the HDMI spec is that devices must "support" 12-bit. This can be done by having the video processor accept a 12-bit HDMI stream and then eventually reducing it down to 10-bit at the LCD glass.

I'd like to see an A/V receiver that accepts DD 5.1 (optical or coax), downmixes it to stereo, drives only two speakers, carries the DD logo, and claims that it supports DD. Dolby Labs would be all over the manufacturer, demanding that they discontinue the product or remove the logo and the claim. There might be a legitimate application for such a product, but it's not DD compliant; and from Dolby Labs' point of view it's a substandard product.
post #10 of 10, 05-21-2007, 05:00 PM - HDMI_Org (Senior Member, Sunnyvale, CA)
Quote:
Originally Posted by ptsenter View Post

I'd like to see an A/V receiver that accepts DD 5.1 (optical or coax), downmixes it to stereo, drives only two speakers, carries the DD logo, and claims that it supports DD. Dolby Labs would be all over the manufacturer, demanding that they discontinue the product or remove the logo and the claim. There might be a legitimate application for such a product, but it's not DD compliant; and from Dolby Labs' point of view it's a substandard product.

Well, the difference between a DD 5.1 and a stereo experience is quite significant, no question about that. However, I don't believe it's fair to compare going from 12-bit LCD glass to 10-bit LCD glass with dropping from DD 5.1 to stereo.

I would expect the manufacturers to do a pretty good job of marketing which displays are 10-, 12-, or 16-bit. From a branding perspective, we feel that getting a display across the "billion color" threshold (compared to today's TVs at 16.7 million colors) is a reasonable bar for use of the Deep Color name.