Yeah, I think a FAQ/sticky thread on this subject in particular would be a good idea.
I'm pretty sure I have a good understanding of the black level stuff like 0-255 vs. 16-235, as well as all the BTB/WTW stuff. However, I'm pretty lost on the x:y:z business (4:4:4, 4:2:2, and so on) and what exactly constitutes "10-bit" and "12-bit" colour. For example, with my PC (which I set up as Full RGB 0-255), my TV indicates a "10-bit" signal as long as I'm using HDMI to get the signal there. On signals below 10-bit (which I'm guessing must be 8-bit?) the TV indicates nothing in that area of the display info bar.
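For anyone else trying to pin the bit-depth part down, here's a quick sketch of the raw numbers as I understand them: per channel you get 256 code values at 8-bit, 1024 at 10-bit, and 4096 at 12-bit, and the 16-235 limited-range endpoints scale up along with the extra bits:

```python
# Rough sketch: code values per channel at each bit depth, plus where the
# 16-235 "limited range" endpoints land once scaled up with the depth.
for depth in (8, 10, 12):
    shift = depth - 8
    levels = 2 ** depth
    black, white = 16 << shift, 235 << shift
    print(f"{depth}-bit: {levels} levels, limited range {black}-{white}")

# 8-bit:  256 levels,  limited range 16-235
# 10-bit: 1024 levels, limited range 64-940
# 12-bit: 4096 levels, limited range 256-3760
```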
On my PS3 though (also set to 0-255/Full RGB), my TV indicates "12-bit". Why/how that is, I'm not really sure. Apparently only "pro graphics" PC video cards like the FireGL and Quadro lines can output 12-bit, but what makes 12-bit even useful or beneficial from the PS3, I'm not sure. I doubt my TV is anything more than a 10-bit panel, so I don't get why 12-bit matters.
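My best guess on the PS3 (treat this as a sketch, not gospel): if the underlying content is 8-bit, a 12-bit signal is typically just the same values padded out with extra bits, so nothing new ever reaches the panel. One common padding scheme is bit replication:

```python
# Sketch: widening an 8-bit code value to 12 bits by bit replication
# (one common padding scheme); no new picture information is created.
def widen_8_to_12(v8):
    return (v8 << 4) | (v8 >> 4)   # top 4 bits repeated into the new LSBs

print(widen_8_to_12(0), widen_8_to_12(128), widen_8_to_12(255))
# 0 2056 4095 -- still only 256 distinct values out of a possible 4096
```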
Additionally, what I don't get is why they never built this stuff into HDMI handshaking and EDID. I mean, if a TV doesn't accept or properly handle certain resolutions or audio formats, it will never get them, because HDMI sources don't send sets stuff they can't receive. But with the black levels and colour spaces and all that, TVs just accept anything and try to display it, even if it won't be displayed correctly.
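For what it's worth, from what I've been able to dig up, later CEA-861 revisions did bolt a flag onto the EDID for part of this: a "QS" bit in the Video Capability Data Block that says the sink can be told which RGB range to expect, though sources seem free to ignore it. A rough sketch of fishing it out of a raw EDID dump, assuming my reading of the data-block layout is right:

```python
def sink_supports_selectable_rgb_range(edid: bytes):
    """Scan a raw EDID dump for the CTA/CEA-861 Video Capability Data
    Block and return its QS bit: True means the sink can be told (via
    the AVI InfoFrame) whether full- or limited-range RGB is coming.
    Returns None if no such block is found."""
    for off in range(128, len(edid), 128):        # walk the extension blocks
        block = edid[off:off + 128]
        if len(block) < 4 or block[0] != 0x02:    # 0x02 = CEA-861 extension
            continue
        dtd_start = block[2]                      # data blocks end where DTDs begin
        i = 4
        while i < dtd_start:
            tag, length = block[i] >> 5, block[i] & 0x1F
            if tag == 7 and length >= 2 and block[i + 1] == 0x00:
                # extended tag 0x00 = Video Capability Data Block
                return bool(block[i + 2] & 0x40)  # bit 6 = QS
            i += 1 + length
    return None
```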
Hard to understand what the HDMI developers were thinking there. Some of this stuff is just as important as having the right resolution and audio formats, but apparently it doesn't matter. Your set can't handle 0-255? Well, a source can just send it anyway, and everything will be too dark and the blacks will be crushed. Even if your set can handle it (which most can these days), if you don't set it up right, it won't look right! WTF? Who thought that was a good idea? LOL.
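To put numbers on the crushed-blacks case: a 16-235 set maps 16 to black and 235 to white, so a full-range signal loses everything outside that window. A quick sketch:

```python
def limited_range_interpretation(v_full):
    """What a 16-235 display makes of a full-range (0-255) code value:
    it stretches 16..235 across 0..1 light output and clips the rest."""
    return min(max((v_full - 16) / (235 - 16), 0.0), 1.0)

# Shadow detail at 0..15 all lands on 0.0 ("crushed blacks");
# highlights at 236..255 all land on 1.0 ("clipped whites").
for v in (0, 8, 16, 128, 235, 245, 255):
    print(v, round(limited_range_interpretation(v), 3))
```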
Another thing I don't get relates to the same issue but extends to DVI. As I understand it, DVI is always RGB 0-255 (correct me if I'm mistaken), and 16-235 is not supported/does not exist over a DVI connection (or a DVI-to-HDMI connection). So you could never send a DVI display 16-235, nor could you send it YCbCr, correct? Fair enough, but with a DVI-to-HDMI connection you can still send full RGB to a display that only supports 16-235, and it will crush the blacks as mentioned. Why is this even possible? If HDMI sets must accept RGB (because DVI connections work), why aren't they all capable of displaying 0-255? My oldest TV (no longer in use) most certainly cannot; it can display RGB (clearly, since there's an image), but it can't display 0-255 correctly. Since DVI-to-HDMI connections are possible on all HDMI sets, I don't get how 0-255 support isn't mandatory as well, because it's the only thing you can get from DVI (again, as I understand it). In other words, if you hook an older TV that only has HDMI up to a DVI output, you will never get a correct picture, despite the fact that you will get a picture. And nowhere in TV manuals do they ever state stuff like this!
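For what it's worth, if a source does let you pick the output range, the adjustment it applies is just a scale and offset; a minimal sketch of squeezing full range into 16-235 for a set like my old one:

```python
def full_to_limited(v_full):
    """Remap a full-range (0-255) code value into 16-235 limited range."""
    return round(16 + v_full * 219 / 255)

print(full_to_limited(0), full_to_limited(128), full_to_limited(255))
# 16 126 235 -- black and white land on the set's expected endpoints
```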