Originally Posted by Richard Paul
I think that you are jumping the gun for two reasons. First off, what is required for Deep Color support is that the HDMI chips are able to either transmit or receive a 36-bit RGB signal. That says nothing about what is required of the display, and a 10-bit display output might be considered sufficient for a Deep Color display. Secondly, just because they used a 30-bit RGB signal for the PS3 demonstration does not mean that the PS3 is not capable of 36-bit RGB output. That might have just been a limitation of the source material they used for the demonstration, or perhaps even of the display that was used.
1.3a Spec p.86:
"Color depths greater than 24 bits are defined to be "Deep Color" modes. All Deep Color modes are optional though if an HDMI Source or Sink supports any Deep Color mode, it shall support 36-bit mode."
1.3a Spec p.4:
"shall _______ A key word indicating a mandatory requirement. Designers are [ITALIC]required[/ITALIC] to implement all such mandatory requirements."
1. The standard does not distinguish between a chip and a display; it talks about a Sink and/or a Source, and the EDID associated with them describes the capabilities of the entire device, not of its components. If a display is 10-bit, it cannot report that it supports 12-bit regardless of what its internal chip can do - otherwise it's cheating (see the sketch after point 2).
2. True, but guilty until proven or, at least, claimed otherwise.
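Just to make point 1 concrete: the p.86 rule is mechanically checkable against the Deep Color flags a sink advertises in its EDID. Here is a minimal sketch in Python, assuming the usual HDMI Vendor-Specific Data Block layout where byte 6 carries the DC_48bit/DC_36bit/DC_30bit bits (the byte offset and bit positions are from memory, so verify against the spec before relying on them):

[CODE]
# Rough sketch, not a full EDID parser: given byte 6 of an HDMI
# Vendor-Specific Data Block from a sink's EDID, check the advertised
# Deep Color flags against the 1.3a rule quoted above.
# Assumed bit positions: DC_48bit = bit 6, DC_36bit = bit 5, DC_30bit = bit 4.

def deep_color_flags_ok(vsdb_byte6: int) -> bool:
    dc_48 = bool(vsdb_byte6 & 0x40)  # 48-bit (16 bits/channel) support
    dc_36 = bool(vsdb_byte6 & 0x20)  # 36-bit (12 bits/channel) support
    dc_30 = bool(vsdb_byte6 & 0x10)  # 30-bit (10 bits/channel) support
    # p.86: if any Deep Color mode is supported, 36-bit shall be supported.
    if (dc_30 or dc_48) and not dc_36:
        return False  # advertises Deep Color but not 36-bit: non-compliant
    return True

print(deep_color_flags_ok(0x10))  # 30-bit only -> False, violates the spec
print(deep_color_flags_ok(0x30))  # 30-bit and 36-bit -> True
[/CODE]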
To me it looks like many, maybe even most, of the Deep Color displays that are coming out this year are best suited for 30-bit RGB. In and of itself that is not a bad thing and 1 billion color combinations is a good improvement when compared to 16.7 million color combinations.
First off, I'm not talking about displays but about the standard itself. Secondly, I don't feel like I need to drop to my knees just because they are going to release substandard products. There are plenty of those as it is.
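As an aside, the 16.7 million and 1 billion figures in the quote are just 2 raised to the total bit depth. A quick sanity check:

[CODE]
# Total distinct colors = 2 ** (total RGB bit depth)
for bits in (24, 30, 36, 48):
    per_channel = bits // 3
    print(f"{bits}-bit RGB ({per_channel} bits/channel): {2 ** bits:,} colors")

# 24-bit ->          16,777,216  (~16.7 million)
# 30-bit ->       1,073,741,824  (~1 billion)
# 36-bit ->      68,719,476,736  (~68.7 billion)
# 48-bit -> 281,474,976,710,656
[/CODE]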